831 results for Formal Methods. Component-Based Development. Competition. Model Checking
Abstract:
The purpose of this study was to investigate the effectiveness of implementing the Self-Regulated Strategy Development (SRSD) model of instruction (Graham & Harris, 2005; Harris & Graham, 1996) on the writing skills and writing self-regulation, attitudes, self-efficacy, and knowledge of six first-grade students. A multiple-baseline design across participants with multiple probes (Kazdin, 2010) was used to test the effectiveness of the SRSD instructional intervention. Each participant was taught an SRSD story-writing strategy as well as self-regulation strategies. All students wrote stories in response to picture prompts during the baseline, instruction, independent performance, and maintenance phases. Stories were assessed for essential story components, length, and overall quality. All participants also completed a writing attitude scale and a writing self-efficacy scale, and participated in brief interviews during the baseline and independent performance phases. Results indicated that SRSD can be beneficial for average first-grade writers. Participants wrote stories that contained more essential components, were longer, and were of better quality after SRSD instruction. Participants also showed some improvement in writing self-efficacy from pre- to post-instruction. All of the students maintained positive writing attitudes throughout the study.
Abstract:
OBJECTIVE: To assess the prevalence of asthma and associated risk factors in children and adolescents. METHODS: Population-based cross-sectional study with 1,185 female and male children and adolescents carried out in the city of Sao Paulo, Southeastern Brazil, from 2008 to 2009. Data were collected through home interviews. Respondents were selected by two-stage (census tract, household) cluster random sampling stratified by gender and age. Multiple Poisson regression was used in the adjusted analysis between the outcome and socioeconomic, demographic, lifestyle and health condition variables. RESULTS: Of all respondents, 9.1% (95%CI 7.0;11.7) reported asthma. After adjustment, the following variables were found to be independently associated with asthma: age (0 to 4 years vs. 15 to 19) (PR 3.18, 95%CI 1.20;8.42); age (5 to 9 years vs. 15 to 19) (PR 6.37, 95%CI 2.64;15.39); age (10 to 14 years vs. 15 to 19) (PR 4.51, 95%CI 1.95;10.40); allergy (yes vs. no) (PR 2.22, 95%CI 1.24;4.00); rhinitis (yes vs. no) (PR 2.13, 95%CI 1.22;3.73); health conditions in the 15 days preceding the interview (yes vs. no) (PR 1.96, 95%CI 1.23;3.11); number of rooms in the household (1 to 3 vs. 4 or more) (PR 1.67, 95%CI 1.05;2.66); and skin color (black and mixed vs. white) (PR 2.00, 95%CI 1.14;3.49). CONCLUSIONS: This study showed the importance of factors associated with asthma, including rhinitis and allergy; age between 5 and 9 years; black and mixed skin color; and households with few rooms. Frequent health problems are seen as a common consequence of asthma.
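The prevalence ratios (PR) in this abstract come from multiple Poisson regression. As a hedged illustration with invented counts (not the study's data), the single-exposure special case can be computed directly; the regression generalises this calculation to adjust for several covariates at once:

```python
import math

# Hypothetical 2x2 counts (NOT the study's data): asthma by rhinitis status.
exposed_cases, exposed_total = 42, 300        # rhinitis: yes
unexposed_cases, unexposed_total = 62, 885    # rhinitis: no

p1 = exposed_cases / exposed_total            # prevalence among exposed
p0 = unexposed_cases / unexposed_total        # prevalence among unexposed
pr = p1 / p0                                  # prevalence ratio

# 95% CI built on the log scale (delta-method standard error for ln PR).
se = math.sqrt(1 / exposed_cases - 1 / exposed_total
               + 1 / unexposed_cases - 1 / unexposed_total)
lo = math.exp(math.log(pr) - 1.96 * se)
hi = math.exp(math.log(pr) + 1.96 * se)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}; {hi:.2f})")
```

A PR above 1 with a confidence interval excluding 1, as for rhinitis and allergy in the abstract, indicates a positive association with the outcome.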
Abstract:
OBJECTIVES: Hemodynamic support is aimed at providing adequate O2 delivery to the tissues; most interventions target an increase in O2 delivery. Mixed venous O2 saturation is a frequently used parameter to evaluate the adequacy of O2 delivery. METHODS: We describe a mathematical model to compare the effects of increasing O2 delivery on venous oxygen saturation through increases in the inspired O2 fraction versus increases in cardiac output. The model was based on the lungs, which were divided into shunted and non-shunted areas, and on seven peripheral compartments, each with normal values of perfusion, optimal oxygen consumption, and critical O2 extraction rate. O2 delivery was increased by changing the inspired fraction of oxygen from 0.21 to 1.0 in steps of 0.1 under conditions of low (2.0 L/min) or normal (6.5 L/min) cardiac output. The same O2 delivery values were also obtained by maintaining a fixed inspired O2 fraction of 0.21 while changing cardiac output. RESULTS: Venous oxygen saturation was higher when produced through increases in the inspired O2 fraction than through increases in cardiac output, even at the same O2 delivery and consumption values. Specifically, at high inspired O2 fractions, the measured O2 saturation values failed to detect conditions of low oxygen supply. CONCLUSIONS: The mode of O2 delivery optimization, specifically increases in the fraction of inspired oxygen versus increases in cardiac output, can compromise the ability of the venous O2 saturation parameter to measure the adequacy of oxygen supply. Consequently, venous saturation at high inspired O2 fractions should be interpreted with caution.
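The mechanism behind this result can be sketched with the Fick principle alone, using assumed textbook values (haemoglobin, VO2, SaO2, PaO2 are invented here and are much simpler than the paper's seven-compartment model): at identical O2 delivery and consumption, venous saturation comes out higher when delivery is achieved via a high inspired O2 fraction (more dissolved O2, higher SaO2) than via a higher cardiac output.

```python
HB = 14.0    # haemoglobin, g/dL (assumed)
VO2 = 250.0  # whole-body O2 consumption, mL/min (assumed)

def cao2(sao2, pao2):
    """Arterial O2 content, mL O2/dL blood: Hb-bound (1.34 mL/g) + dissolved."""
    return 1.34 * HB * sao2 + 0.003 * pao2

def svo2(co, sao2, pao2):
    """Venous saturation via the Fick principle (venous dissolved O2 neglected)."""
    cvo2 = cao2(sao2, pao2) - VO2 / (co * 10.0)  # CO in L/min; x10 converts to dL/min
    return cvo2 / (1.34 * HB)

# Case A: low cardiac output (2.0 L/min), FiO2 = 1.0 (SaO2 ~ 1.0, PaO2 ~ 500 mmHg).
do2_a = 2.0 * 10.0 * cao2(1.0, 500.0)
# Case B: FiO2 = 0.21 (SaO2 ~ 0.97, PaO2 ~ 100 mmHg), CO chosen to match DO2.
co_b = do2_a / (10.0 * cao2(0.97, 100.0))

sv_a = svo2(2.0, 1.0, 500.0)
sv_b = svo2(co_b, 0.97, 100.0)
print(f"Same DO2 ({do2_a:.0f} mL/min): SvO2 = {sv_a:.2f} via FiO2, {sv_b:.2f} via CO")
```

Even in this crude sketch the FiO2 route yields the higher venous saturation at equal delivery, which is the masking effect the abstract warns about.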
Abstract:
This paper reviews the literature on dividend policy in Brazil, focusing on the empirical studies conducted from 1990 to 2010 that were published in major Brazilian administration, accounting and finance journals and major conference proceedings on this subject. The analyzed sample comprised 39 studies using various methods and conducted in various periods. Based on the model of Harris and Raviv (1991), this paper grouped the studies according to model type into five main categories. Some tendencies could be identified, such as: the relevance of dividend policy in the Brazilian market; confirmation of the existence of agency problems; conflicting findings regarding the clientele hypothesis; tax signaling in, and tax impact on, the definition of dividend policy; and non-conflicting findings regarding the key factors of dividend policy.
Abstract:
This article analyses the changes in Brazilian food retailing by investigating the co-existence of, and the pricing variation across, large supermarket chains and small independent supermarkets. It uses cointegration tests to show that, despite the widespread belief that small supermarkets are inefficient and charge higher prices, they in fact charge lower prices. Accordingly, in contrast to the prevailing literature on food-retail development, competition in food retail is complex and cannot be described as a simple Darwinian process of market concentration. The article explores the survival of small retail and its consequences for the current discussion on modern food retail in developing countries.
Abstract:
Background Cost-effectiveness studies have increasingly been part of decision processes for incorporating new vaccines into the Brazilian National Immunisation Program. This study aimed to evaluate the cost-effectiveness of the 10-valent pneumococcal conjugate vaccine (PCV10) in the universal childhood immunisation programme in Brazil. Methods A decision-tree analytical model based on the ProVac Initiative pneumococcus model was used, following 25 successive cohorts from birth until 5 years of age. Two strategies were compared: (1) status quo and (2) universal childhood immunisation programme with PCV10. Epidemiological and cost estimates for pneumococcal disease were based on National Health Information Systems and the literature. A 'top-down' costing approach was employed. Costs are reported in 2004 Brazilian reals. Costs and benefits were discounted at 3%. Results 25 years after implementing the PCV10 immunisation programme, 10 226 deaths, 360 657 disability-adjusted life years (DALYs), 433 808 hospitalisations and 5 117 109 outpatient visits would be avoided. The cost of the immunisation programme would be R$10 674 478 765, and the expected savings on direct medical costs and family costs would be R$1 036 958 639 and R$209 919 404, respectively. This resulted in an incremental cost-effectiveness ratio of R$778 145/death avoided and R$22 066/DALY avoided from the societal perspective. Conclusion The PCV10 universal infant immunisation programme is a cost-effective intervention (1-3 GDP per capita/DALY avoided). Owing to the uncertain burden of disease data, as well as unclear long-term vaccine effects, surveillance systems to monitor the long-term effects of this programme will be essential.
Abstract:
Background Previous studies have established that mycobacterial infections ameliorate allergic inflammation. However, a non-infectious approach that controls allergic responses might represent a safer and more promising strategy. The 60-65 kDa heat shock protein (Hsp) family is endowed with anti-inflammatory properties, but it is still unclear whether and how a single mycobacterial Hsp can control allergic disorders. Objective Therefore, in this study we determined whether the administration of Mycobacterium leprae Hsp65 expressed from a recombinant DNA plasmid could attenuate a previously established allergic response. Methods We used an experimental model of airway allergic inflammation to test the effects of immunotherapy with DNA encoding Hsp65. Allergic mice, previously sensitized and challenged with ovalbumin, were treated with three intramuscular doses of recombinant DNA encoding Hsp65. After treatment, mice received a second allergen challenge and the allergic response was measured. Results We found that immunotherapy attenuated eosinophilia, pulmonary inflammation, and Th2 cytokine and mucus production. Moreover, we showed that the inhibition of the allergic response is dependent on IL-10 production. Both Hsp65- and allergen-specific IL-10-producing cells contributed to this effect. Cells transferred from DNA-immunized mice to allergic mice migrated to allergic sites and down-modulated the Th2 response. Conclusions and Clinical Relevance Our findings clearly show that immunotherapy with DNA encoding Hsp65 can attenuate an established Th2 allergic inflammation through an IL-10-dependent mechanism; moreover, the migration of allergen- and Hsp65-specific cells to the allergic sites plays a fundamental role. This work represents a novel contribution to the understanding of immune regulation by Hsp65 in allergic diseases.
Abstract:
Despite the great importance of soybeans in Brazil, there have been few applications of soybean crop modeling under Brazilian conditions. Thus, the objective of this study was to use modified crop models to estimate the depleted and potential soybean crop yield in Brazil. The climatic variables used in the modified simulation of the soybean crop models were temperature, insolation and rainfall. The data set was taken from 33 counties (28 São Paulo state counties, and 5 counties from other states that neighbor São Paulo). Modifications to the estimation of the leaf area of the soybean crop, including corrections for temperature, shading, senescence, CO2, and biomass partitioning, were proposed; the methods of inputting the climatic variables into the model's simulation were also reconsidered. The depleted yields were estimated through a water balance, from which the depletion coefficient was estimated. It can be concluded that the adapted soybean crop growth model can be used to predict the depleted and potential yield of soybeans, and also to indicate better locations and periods for tillage.
Abstract:
Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of being a participant in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate. Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2.
Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed on the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role, as assumed by the Scandinavian model, for any of the sectors. Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "Imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as a result of a "battle of mark-ups" between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two.
Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase of consumer prices by one per cent lifts private-sector nominal wages by 0.8 per cent. Furthermore, an increase of private-sector nominal wages by one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement during 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The “offensive” devaluation of the Swedish krona by 16 per cent in 1982:4, and the start of a floating Swedish krona and the substantial depreciation of the krona at this time affected the determination of import prices.
Abstract:
[ES] This work describes the development of an application for recording the teaching hours delivered by lecturers at the university. The aim is to have this information digitalized in order to speed up any administrative procedures that need to be carried out with it. On the lecturers' side, e-mail notifications will be sent to confirm the signed-off teaching, as a personal record so that each lecturer knows what teaching they have delivered and, in the case of a substitution, so that the substituted lecturer also has a record of the substitution. The development will rely on agile methods, using test-driven development for the model and persistence modules.
Abstract:
The need for a convergence between semi-structured data management and Information Retrieval techniques is manifest to the scientific community. To meet this growing demand, the W3C has recently proposed XQuery Full Text, an IR-oriented extension of XQuery. However, query optimization requires the study of important properties such as query equivalence and containment; to this aim, a formal representation of documents and queries is needed. The goal of this thesis is to establish such a formal background. We define a data model for XML documents and propose an algebra able to represent most XQuery Full Text expressions. We show how an XQuery Full Text expression can be translated into an algebraic expression and how an algebraic expression can be optimized.
Abstract:
[EN] This paper presents a location–price equilibrium problem on a tree. A sufficient condition is given for the existence of a Nash equilibrium in a spatial competition model that incorporates price, transport, and externality costs. This condition implies that both competitors are located at the same point, a vertex that is the unique median of the tree. However, this is not a necessary condition for equilibrium: some examples show that not all medians are equilibria. Finally, an application to the Tenerife tram is presented.
Abstract:
Stock overexploitation and socio-economic sustainability are two major issues currently at stake in European fisheries. In this context, the European Commission is considering the implementation of management plans as a means to move towards a longer-term perspective on fisheries management, to account for regional differences and to increase stakeholder involvement. Adriatic small pelagic species (anchovies and sardines) are among the most studied species in the world from a biological perspective, and several economic analyses of the Italian pelagic fishery have also been carried out; despite that, no complete bioeconomic model considering all biological, technical and economic issues has yet been developed. Bioeconomic models cannot be considered foolproof tools, but they are important instruments for helping decision makers and can supply a fundamental scientific basis for management plans. This research gathers all available information (from biological, technological and economic perspectives) in order to build a bioeconomic model of the Adriatic pelagic fishery. Different approaches are analyzed and some of them developed further to highlight potential divergences in results, characteristics and implications. Growth, production and demand functions are estimated. A formal analysis of the interaction and competition between the Italian and Croatian fleets is carried out, proposing different equilibria for open access, duopoly and a form of cooperative solution. However, normative judgments are limited by poor knowledge of population dynamics and of data relating to the Croatian fleet.
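One classical way to formalise the contrast between open-access and cooperative equilibria is the Gordon-Schaefer surplus-production model; the sketch below uses invented parameters purely for illustration and is not the thesis's estimated model:

```python
# Stylised Gordon-Schaefer model: growth G(X) = r*X*(1 - X/K), harvest
# H = q*E*X, sustained profit = p*H - c*E. All parameters are made up.
r, K = 0.6, 1000.0          # intrinsic growth rate, carrying capacity
p, q, c = 5.0, 0.002, 4.0   # price, catchability, unit cost of effort

# Open access: entry continues until rents are fully dissipated (profit = 0),
# which pins the stock at X = c / (p*q).
x_open = c / (p * q)

# Maximum sustainable yield stock: G(X) is maximised at K/2.
x_msy = K / 2

# Sole-owner (cooperative) optimum at zero discount rate: maximise sustained
# profit pi(X) = r*(1 - X/K)*(p*X - c/q), giving X = (K/2)*(1 + c/(p*q*K)).
x_opt = (K / 2) * (1 + c / (p * q * K))

print(f"stock: open access {x_open:.0f} < MSY {x_msy:.0f} < sole owner {x_opt:.0f}")
```

The ordering open access < MSY < sole owner is the standard result: uncoordinated fleets drive the stock below the biologically and economically optimal levels, which is why the thesis contrasts open-access, duopoly and cooperative solutions for the two fleets.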
Abstract:
Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows considerable improvement of our fully inferential approach compared with the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how we can learn about such uncertainty through suitable stochastic data fusion modeling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
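The forecast target defined above combines observed and predicted hours into a single running mean. A minimal sketch with hypothetical hourly values (the numbers are invented; only the 4-observed + current + 3-predicted structure comes from the abstract):

```python
# Current 8-hour average ozone: previous four observed hours, the current
# hour, and numerical-model predictions for the next three hours.
observed = [52.0, 55.0, 58.0, 60.0, 63.0]   # hours t-4 .. t (ppb, made up)
predicted = [61.0, 59.0, 57.0]              # hours t+1 .. t+3 from the model
avg_8h = sum(observed + predicted) / 8.0
print(f"current 8-hour average ozone: {avg_8h:.1f} ppb")
```

Because three of the eight hours are model predictions, the accuracy of this target depends directly on the downscaler's calibration of the numerical model output.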
Abstract:
Modern software systems, in particular distributed ones, are everywhere around us and are at the basis of our everyday activities. Hence, guaranteeing their correctness, consistency and safety is of paramount importance. Their complexity makes the verification of such properties a very challenging task. It is natural to expect that these systems are reliable and, above all, usable. i) In order to be reliable, compositional models of software systems need to account for consistent dynamic reconfiguration, i.e., changing at runtime the communication patterns of a program. ii) In order to be usable, compositional models of software systems need to account for interaction, which can be seen as communication patterns among components that collaborate to achieve a common task. The aim of this Ph.D. was to develop powerful techniques based on formal methods for the verification of correctness, consistency and safety properties related to dynamic reconfiguration and communication in complex distributed systems. In particular, static analysis techniques based on types and type systems appeared to be an adequate methodology, considering their success in guaranteeing not only basic safety properties but also more sophisticated ones, like deadlock or livelock freedom in a concurrent setting. The main contributions of this dissertation are twofold. i) On the components side: we design types and a type system for a concurrent object-oriented calculus to statically ensure the consistency of dynamic reconfigurations related to modifications of communication patterns in a program during execution time. ii) On the communication side: we study advanced safety properties related to communication in complex distributed systems, like deadlock freedom, livelock freedom and progress. Most importantly, we exploit an encoding of the types and terms of a typical distributed language, the session π-calculus, into the standard typed π-calculus, in order to understand their expressive power.