831 results for Formal Methods. Component-Based Development. Competition. Model Checking


Relevance:

100.00%

Publisher:

Abstract:

Despite significant progress in climate impacts research, the narratives that science can presently piece together of a 2-, 3-, 4-, or 5-degree warmer world remain fragmentary. Here we briefly review past undertakings to comprehensively characterize and quantify climate impacts based on multi-model approaches. We then report on the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), a community-driven effort to systematically compare impacts models across sectors and scales, and to quantify the uncertainties along the chain from greenhouse gas emissions and climate input data to the modelling of climate impacts themselves. We show how ISI-MIP and similar efforts can substantially advance the science relevant to impacts, adaptation and vulnerability, and we outline the steps that need to be taken in order to make the most of available modelling tools. We discuss pertinent limitations of these methods and how they could be tackled. We argue that it is time to consolidate the current patchwork of impacts knowledge through integrated cross-sectoral assessments, and that the climate impacts community is now in a favourable position to do so.

Relevance:

100.00%

Publisher:

Abstract:

Highly heterogeneous mountain snow distributions strongly affect soil moisture patterns; local ecology; and, ultimately, the timing, magnitude, and chemistry of stream runoff. Capturing these vital heterogeneities in a physically based distributed snow model requires appropriately scaled model structures. This work looks at how model scale—particularly the resolutions at which the forcing processes are represented—affects simulated snow distributions and melt. The research area is in the Reynolds Creek Experimental Watershed in southwestern Idaho. In this region, where there is a negative correlation between snow accumulation and melt rates, overall scale degradation pushed simulated melt to earlier in the season. The processes mainly responsible for snow distribution heterogeneity in this region—wind speed, wind-affected snow accumulations, thermal radiation, and solar radiation—were also independently rescaled to test process-specific spatiotemporal sensitivities. It was found that in order to accurately simulate snowmelt in this catchment, the snow cover needed to be resolved to 100 m. Wind and wind-affected precipitation—the primary influence on snow distribution—required similar resolution. Thermal radiation scaled with the vegetation structure (~100 m), while solar radiation was adequately modeled with 100–250-m resolution. Spatiotemporal sensitivities to model scale were found that allowed for further reductions in computational costs through the winter months with limited losses in accuracy. It was also shown that these modeling-based scale breaks could be associated with physiographic and vegetation structures to aid a priori modeling decisions.

Relevance:

100.00%

Publisher:

Abstract:

A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg–de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its 'distance' to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
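The importance resampling step described above can be sketched in a few lines. This is a generic illustration on a toy one-dimensional Gaussian ensemble with a single observation, not the ocean-model implementation from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_resample(ensemble, observation, obs_std):
    """Weight each ensemble member by its 'distance' to the observation
    (Bayes's theorem with a Gaussian likelihood), then resample so that
    high-weight members are duplicated and low-weight members are
    largely ignored."""
    weights = np.exp(-0.5 * ((ensemble - observation) / obs_std) ** 2)
    weights /= weights.sum()
    # Draw members with probability proportional to their weights.
    idx = rng.choice(len(ensemble), size=len(ensemble), p=weights)
    return ensemble[idx]

# Toy example: a 1000-member prior ensemble and one observation.
prior = rng.normal(loc=0.0, scale=2.0, size=1000)
posterior = importance_resample(prior, observation=1.5, obs_std=0.5)
```

After resampling, the ensemble mean is pulled toward the observation and, in this Gaussian toy case, the spread shrinks; as the abstract notes, with non-Gaussian densities the posterior variance can also grow.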

Relevance:

100.00%

Publisher:

Abstract:

This study proposes a model of how deeply held beliefs, known as ‘social axioms’, moderate the interaction between reputation, its causes and consequences with stakeholders. It contributes to the stakeholder relational field of reputation theory by explaining why the same organizational stimuli lead to different individual stakeholder responses. The study provides a shift in reputation research from organizational-level stimuli as the root causes of stakeholder responses to exploring the interaction between individual beliefs and organizational stimuli in determining reputational consequences. Building on a conceptual model that incorporates product/service quality and social responsibility as key reputational dimensions, the authors test empirically for moderating influences, in the form of social axioms, between reputation-related antecedents and consequences, using component-based structural equation modelling (n = 204). In several model paths, significant differences are found between responses of individuals identified as either high or low on social cynicism, fate control and religiosity. The results suggest that stakeholder responses to reputation-related stimuli can be systematically predicted as a function of the interactions between the deeply held beliefs of individuals and these stimuli. The authors offer recommendations on how strategic reputation management can be approached within and across stakeholder groups at a time when firms grapple with effective management of diverse stakeholder expectations.

Relevance:

100.00%

Publisher:

Abstract:

This thesis considers Participatory Crop Improvement (PCI) methodologies and examines the reasons behind their continued contestation and limited mainstreaming in conventional modes of crop improvement research within National Agricultural Research Systems (NARS). In particular, it traces the experiences of a long-established research network with over 20 years of experience in developing and implementing PCI methods across South Asia, and specifically considers its engagement with the Indian NARS and associated state-level agricultural research systems. In order to address the issues surrounding PCI institutionalisation processes, a novel conceptual framework was derived from a synthesis of the literatures on Strategic Niche Management (SNM) and Learning-based Development Approaches (LBDA) to analyse the socio-technical processes and structures which constitute the PCI ‘niche’ and NARS ‘regime’. In examining the niche and regime according to their socio-technical characteristics, the framework provides explanatory power for understanding the nature of their interactions and the opportunities and barriers that exist with respect to the translation of lessons and ideas between niche and regime organisations. The research shows that in trying to institutionalise PCI methods and principles within NARS in the Indian context, PCI proponents have encountered a number of constraints related to the rigid and hierarchical structure of the regime organisations; the contractual mode of most conventional research, which inhibits collaboration with a wider group of stakeholders; and the time-limited nature of PCI projects themselves, which limits investment and hinders scaling up of the innovations. 
It also reveals that while the niche projects may be able to induce a ‘weak’ form of PCI institutionalisation within the Indian NARS by helping to alter their institutional culture to be more supportive of participatory plant breeding approaches and future collaboration with PCI researchers, a ‘strong’ form of PCI institutionalisation, in which NARS organisations adopt participatory methodologies across their entire crop improvement agenda, is likely to remain outside of the capacity of PCI development projects to deliver.

Relevance:

100.00%

Publisher:

Abstract:

The study is based on 141 pregnant Bos indicus cows, from days 20 to 70 post-insemination. First, special attention was given to the macroscopically observable phenomena of attachment of the conceptus to the uterus, i.e. the implantation, from about days 20 to 30 post-insemination up to day 70, and to placentome development by growth, vascularization and increase in the number of cotyledons opposite the endometrial caruncles. Secondly, as for the conceptuses, semiquantitative statistical analyses were performed of the lengths of chorio-allantois, amnion and yolk sac; the different parts of the centre and two extremes of the yolk sacs were also analysed. Thirdly, the embryos/foetuses corresponding to their membranes were measured by their greatest length and by weight, and described by the appearance of external developmental phenomena during the investigated period, such as neurulation, somites, branchial arches, brain vesicles, limb buds, C-form, pigmented eye and facial grooves. In conclusion, all the data collected in this study from days 20 to 70 of bovine pregnancy were compared extensively with corresponding data in the literature. This resulted in an ‘embryo/foetal age-scale’, which extends the data in the literature by covering the first 8 to 70 days of pregnancy. This age-scale of early bovine intrauterine development provides a model for studies using macroscopic techniques, even when slaughtered cows without distinct knowledge of insemination or fertilization time are used. This distinctly facilitates research into the cow, which is now widely used as ‘an experimental animal’ for testing new reproductive techniques such as in vitro fertilization, embryo transfer and cloning.

Relevance:

100.00%

Publisher:

Abstract:

Densities and viscosities of five vegetable oils (Babassu oil, Buriti oil, Brazil nut oil, macadamia oil, and grape seed oil) and of three blends of Buriti oil and soybean oil were measured as a function of temperature and correlated by empirical equations. The estimation capability of two types of predictive methodologies was tested using the measured data. The first group of methods was based on the fatty acid composition of the oils, while the other was based on their triacylglycerol composition, as a multicomponent system. In general, the six models tested presented a good representation of the physical properties considered in this work. A simple method of calculation is also proposed to predict the dynamic viscosity of methyl and ethyl ester biodiesels, based on the fatty acid composition of the original oil. Data presented in this work and the developed model can be valuable for designing processes and equipment for the edible oil industry and for biodiesel production.
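As an illustration of the kind of empirical temperature correlation mentioned above, an Andrade-type equation ln η = A + B/T is a common form for the viscosity of oils. The data points below are invented for the sketch, not the measured values from the study:

```python
import numpy as np

# Illustrative (invented) dynamic viscosity data for a vegetable oil:
# temperatures in K, viscosities in mPa·s.
T = np.array([293.15, 313.15, 333.15, 353.15])
eta = np.array([70.0, 35.0, 20.0, 12.5])

# Fit ln(eta) = A + B / T; polyfit returns coefficients low-to-high,
# so A is the intercept and B the slope against 1/T.
A, B = np.polynomial.polynomial.polyfit(1.0 / T, np.log(eta), 1)

def viscosity(temp_K):
    """Predict dynamic viscosity (mPa·s) at a given temperature (K)."""
    return float(np.exp(A + B / temp_K))

print(viscosity(323.15))  # interpolates between the 313 K and 333 K points
```

The same two-parameter fit can be applied per oil or blend, which is all that is needed to interpolate viscosity within the measured temperature range.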

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we introduce a Bayesian analysis for bioequivalence data assuming multivariate pharmacokinetic measures. With the introduction of correlation parameters between the pharmacokinetic measures or between the random effects in the bioequivalence models, we observe a good improvement in the bioequivalence results. These results are of great practical interest since they can yield higher accuracy and reliability for the bioequivalence tests usually required by regulatory offices. An example is introduced to illustrate the proposed methodology by comparing the usual univariate bioequivalence methods with multivariate bioequivalence. We also consider some existing Bayesian model discrimination methods to choose the best model to be used in bioequivalence studies.

Relevance:

100.00%

Publisher:

Abstract:

This work describes the development and optimization of a sequential injection method to automate the determination of paraquat by square-wave voltammetry employing a hanging mercury drop electrode. Automation by sequential injection enhanced the sampling throughput, improving the sensitivity and precision of the measurements as a consequence of the highly reproducible and efficient conditions of mass transport of the analyte toward the electrode surface. For instance, 212 analyses can be made per hour if the sample/standard solution is prepared off-line and the sequential injection system is used just to inject the solution towards the flow cell. In-line sample conditioning reduces the sampling frequency to 44 h⁻¹. Experiments were performed in 0.10 M NaCl, which was the carrier solution, using a frequency of 200 Hz, a pulse height of 25 mV, a potential step of 2 mV, and a flow rate of 100 µL s⁻¹. For a concentration range between 0.010 and 0.25 mg L⁻¹, the current (i_p, µA) read at the potential corresponding to the peak maximum fitted the following linear equation with the paraquat concentration (mg L⁻¹): i_p = (−20.5 ± 0.3) C_paraquat − (0.02 ± 0.03). The limits of detection and quantification were 2.0 and 7.0 µg L⁻¹, respectively. The accuracy of the method was evaluated by recovery studies using spiked water samples that were also analyzed by molecular absorption spectrophotometry after reduction of paraquat with sodium dithionite in an alkaline medium. No evidence of statistically significant differences between the two methods was observed at the 95% confidence level.
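The reported linear calibration can be inverted to convert a measured peak current into a paraquat concentration. A minimal sketch using the stated slope, intercept and detection limit (the function name is ours):

```python
# Calibration from the abstract: i_p (µA) = -20.5 * C (mg/L) - 0.02,
# valid over 0.010-0.25 mg/L; LOD = 2.0 µg/L (0.002 mg/L).
SLOPE, INTERCEPT = -20.5, -0.02
LOD_MG_L = 0.002

def paraquat_conc(ip_uA):
    """Invert the linear calibration; returns mg/L, or None below the LOD."""
    c = (ip_uA - INTERCEPT) / SLOPE
    return c if c >= LOD_MG_L else None

print(paraquat_conc(-2.07))   # ≈ 0.10 mg/L
```

In practice the reported uncertainties on slope and intercept (±0.3 and ±0.03) would be propagated into the concentration estimate as well.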

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to develop a fast capillary electrophoresis method for the determination of propranolol in pharmaceutical preparations. In the method development, the pH and constituents of the background electrolyte were selected using the effective mobility versus pH curves. Benzylamine was used as the internal standard. The background electrolyte was composed of 60 mmol L⁻¹ tris(hydroxymethyl)aminomethane and 30 mmol L⁻¹ 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length, 8.5 cm effective length, 50 µm I.D.) with a short-end injection configuration and direct UV detection at 214 nm. The run time was only 14 s. Three different strategies were studied in order to develop a fast CE method with low total analysis time for propranolol analysis: low flush time (Lflush), 35 runs/h; without flush (Wflush), 52 runs/h; and Invert (switched polarity), 45 runs/h. Since the three strategies developed are statistically equivalent, Wflush was selected due to its higher analytical frequency in comparison with the other methods. Figures of merit of the proposed method include good linearity (R² > 0.9999), a limit of detection of 0.5 mg L⁻¹, inter-day precision better than 1.03% (n = 9) and recovery in the range of 95.1-104.5%.

Relevance:

100.00%

Publisher:

Abstract:

Background: Voice processing in real time is challenging. A drawback of previous work on Hypokinetic Dysarthria (HKD) recognition is the requirement of controlled settings in a laboratory environment. A personal digital assistant (PDA) has been developed for home assessment of PD patients. The PDA offers sound processing capabilities, which allow for developing a module for the recognition and quantification of HKD. Objective: To compose an algorithm for assessment of PD speech severity in the home environment based on a review synthesis. Methods: A two-tier review methodology is utilized. The first tier focuses on real-time problems in speech detection. In the second tier, acoustic features that are robust to medication changes in Levodopa-responsive patients are investigated for HKD recognition. Keywords such as 'Hypokinetic Dysarthria' and 'speech recognition in real time' were used in the search engines. IEEE Xplore produced the most useful search hits as compared to Google Scholar, ELIN, EBRARY, PubMed and LIBRIS. Results: Vowel and consonant formants are the most relevant acoustic parameters for reflecting PD medication changes. Since the relevant speech segments (consonants and vowels) contain a minority of the speech energy, intelligibility can be improved by amplifying the voice signal using amplitude compression. Pause detection and peak-to-average power rate calculations for voice segmentation produce rich voice features in real time. Voice segmentation can be enhanced by introducing the zero-crossing rate (ZCR): consonants have a high ZCR whereas vowels have a low ZCR. The wavelet transform is found promising for voice analysis since it quantizes non-stationary voice signals over a time series using scale and translation parameters. In this way, voice intelligibility in the waveforms can be analyzed in each time frame. Conclusions: This review evaluated HKD recognition algorithms to develop a tool for PD speech home assessment using modern mobile technology.
An algorithm that tackles real-time constraints in HKD recognition based on the review synthesis is proposed. We suggest that speech features may be further processed using wavelet transforms and used with a neural network for detection and quantification of speech anomalies related to PD. Based on this model, patients' speech can be automatically categorized according to UPDRS speech ratings.
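The consonant/vowel contrast in zero-crossing rate is easy to illustrate. Here a low-frequency tone stands in for a voiced vowel and white noise for a consonant; these are illustrative signals, not the proposed PDA module:

```python
import numpy as np

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ: high for
    noisy, consonant-like segments; low for voiced, vowel-like ones."""
    signs = np.sign(frame)
    return float(np.mean(signs[:-1] != signs[1:]))

fs = 8000                                   # sampling rate (Hz)
t = np.arange(fs // 10) / fs                # one 100 ms frame
vowel_like = np.sin(2 * np.pi * 150 * t)    # voiced-like 150 Hz tone
consonant_like = np.random.default_rng(1).standard_normal(len(t))

print(zero_crossing_rate(vowel_like), zero_crossing_rate(consonant_like))
```

A simple threshold on the per-frame ZCR is then enough to flag consonant-like frames for the segmentation step described above.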

Relevance:

100.00%

Publisher:

Abstract:

Due to increasing demand for water and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consultancy offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, by merging hydrological and reservoir simulation/optimization models with various numerical weather predictions, radar and satellite data. Model performance is strongly related to the streamflow forecast, its associated uncertainty and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on a previously developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in the northwestern part of Turkey. The reservoir represents a typical example by its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 (MM5) and Ensemble Prediction System (EPS) data are used as the main input, and flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. It thus provides the operator with risk ranges, in terms of spillway discharges and reservoir level, when compared with the deterministic approach.
The framework is fully data-driven, applicable and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
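The idea of turning equally probable EPS scenarios into risk ranges for the operator can be sketched as percentile bands over the ensemble. The inflow numbers below are synthetic stand-ins, not Yuvacık data or HEC-HMS output:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic ensemble: 50 equally probable inflow scenarios (m³/s)
# over a 5-day horizon, standing in for EPS-driven flow forecasts.
ensemble = rng.gamma(shape=4.0, scale=10.0, size=(50, 5))

# Percentile bands give the operator a risk range per lead day,
# instead of the single trajectory of a deterministic forecast.
p10, p50, p90 = np.percentile(ensemble, [10, 50, 90], axis=0)
for day, (lo, med, hi) in enumerate(zip(p10, p50, p90), start=1):
    print(f"day {day}: {lo:5.1f} - {hi:5.1f} m³/s (median {med:5.1f})")
```

The same bands, pushed through the reservoir simulation, translate directly into ranges of spillway discharge and reservoir level for each lead day.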