904 results for calibration of rainfall-runoff models


Relevance:

100.00%

Publisher:

Abstract:

This study examines the effect of blood absorption on the endogenous fluorescence signal intensity of biological tissues. Experimental studies were conducted to identify these effects: fluorescence spectroscopy was used to record the fluorescence intensity, and laser Doppler flowmetry to measure the intensity of blood flow. We propose one possible implementation of the Monte Carlo method for theoretical analysis of the effect of blood on fluorescence signals. The simulation is built on a four-layer optical model of skin, using known optical parameters of skin at different levels of blood supply. With this simulation we demonstrate how the level of blood supply can alter the shape of the fluorescence spectra. In addition, to characterize the tissue properties that may affect the fluorescence spectra, we turned to diffuse reflectance spectroscopy (DRS). From the spectral data provided by DRS, the tissue attenuation effect can be extracted and used to correct the fluorescence spectra.
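
The abstract gives no implementation details; as a rough sketch of the kind of weighted-photon Monte Carlo scheme it describes, the code below propagates photon packets through a four-layer slab and attenuates their weight by layer-specific absorption. The layer thicknesses, absorption coefficients, and scattering coefficient are invented placeholders, not the authors' values.

```python
import random
import math

# Hypothetical four-layer skin model: (thickness [cm], absorption mu_a [1/cm]).
# These numbers are illustrative placeholders, NOT the paper's parameters.
LAYERS = [(0.01, 0.2), (0.02, 1.5), (0.05, 0.8), (0.10, 0.4)]
MU_S = 10.0  # illustrative scattering coefficient [1/cm], shared by all layers


def layer_index(z):
    """Return the layer containing depth z, or None if the photon left the slab."""
    depth = 0.0
    for i, (thickness, _) in enumerate(LAYERS):
        depth += thickness
        if z < depth:
            return i
    return None


def propagate(n_photons=10000):
    """Track surviving photon weight as a crude proxy for the detected signal."""
    detected = 0.0
    for _ in range(n_photons):
        z, weight, direction = 0.0, 1.0, 1.0  # start at surface, heading inward
        while weight > 1e-4:
            step = -math.log(random.random()) / MU_S  # free path between scatterings
            z += direction * step
            if z < 0.0:          # escaped back through the surface: "detected"
                detected += weight
                break
            layer = layer_index(z)
            if layer is None:    # passed through the bottom of the slab
                break
            mu_a = LAYERS[layer][1]
            weight *= math.exp(-mu_a * step)        # Beer-Lambert attenuation
            direction = random.choice((-1.0, 1.0))  # isotropic 1-D scattering
    return detected / n_photons


if __name__ == "__main__":
    # Raising mu_a in the blood-rich layer (index 1) lowers the detected fraction,
    # mimicking how blood supply suppresses the measured fluorescence.
    print("detected fraction:", propagate())
```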

Relevance:

100.00%

Publisher:

Abstract:

The author surveys the changes that have taken place in the field of applied multi-sector modelling, from linear programming models to computable general equilibrium models. After a brief historical retrospective, he presents the common and distinguishing features of general equilibrium models by comparing them with national-level models based on linear programming methods. He also shows how general equilibrium models can be used to analyse the consistency of economic policy targets, to investigate trade-off possibilities among those targets and, more generally, to carry out sensitivity analyses of economic policy proposals. The discussion of the theoretical and methodological questions is illustrated with the aid of a computable general equilibrium model.
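
As a toy illustration of the price-quantity interdependence that distinguishes general equilibrium models from linear programming models, the sketch below solves a two-good, two-consumer pure exchange economy for its market-clearing price. The Cobb-Douglas preferences and endowments are invented for illustration; the article's model is far richer.

```python
import numpy as np
from scipy.optimize import brentq

# Toy exchange economy: two consumers, two goods, Cobb-Douglas utilities.
# alpha[i] = consumer i's expenditure share on good 1; endow[i] = endowment vector.
alpha = np.array([0.3, 0.7])
endow = np.array([[1.0, 0.0],
                  [0.0, 1.0]])


def excess_demand_good1(p1):
    """Aggregate excess demand for good 1 at prices (p1, 1); good 2 is numeraire."""
    prices = np.array([p1, 1.0])
    income = endow @ prices                 # market value of each endowment
    demand = alpha * income / p1            # Cobb-Douglas demand for good 1
    return demand.sum() - endow[:, 0].sum()


# By Walras' law, clearing the market for good 1 clears good 2 as well.
p_star = brentq(excess_demand_good1, 1e-6, 1e6)
print(f"equilibrium relative price of good 1: {p_star:.4f}")
```

Unlike a linear program, where prices appear only as fixed coefficients, here the quantities demanded respond to the price and the price is determined by the quantities: the interdependence the abstract emphasizes.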

Relevance:

100.00%

Publisher:

Abstract:

Following the introduction of the Basel II capital accord, banks and credit institutions in Hungary also began to build their own internal rating systems, whose maintenance and development is a continuing task. The author explores whether the predictive power of bankruptcy forecasting models can be increased with traditional mathematical-statistical methods by incorporating the magnitude of change over time in the financial ratios into the models. The empirical results suggest that the development over time of the financial ratios of Hungarian firms carries important information about their future solvency, since using such indicators significantly improves the predictive power of the bankruptcy models. The author also examines whether correcting extremely high or low observation values before modelling improves the classification performance of the models.
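
The central idea, that adding the change of financial ratios over time improves failure prediction, can be shown in a minimal sketch. The synthetic data, feature names, and the choice of logistic regression are illustrative assumptions, not the author's dataset or method, though logistic regression is a traditional technique of the kind the abstract mentions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic panel: two years of a liquidity ratio per firm, plus a default flag.
n = 2000
ratio_t1 = rng.normal(1.2, 0.4, n)               # ratio one year earlier
ratio_t0 = ratio_t1 + rng.normal(-0.05, 0.2, n)  # most recent year
trend = ratio_t0 - ratio_t1                      # the "change over time" feature
# Toy mechanism: firms with low AND deteriorating ratios default more often.
p_default = 1 / (1 + np.exp(2.5 * ratio_t0 + 4.0 * trend))
y = rng.random(n) < p_default

X_static = ratio_t0.reshape(-1, 1)               # levels only
X_dynamic = np.column_stack([ratio_t0, trend])   # levels + change

for name, X in [("levels only", X_static), ("levels + change", X_dynamic)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```

On data generated this way, the model with the change feature scores a visibly higher AUC, mirroring the abstract's empirical finding.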

Relevance:

100.00%

Publisher:

Abstract:

Linear multisectoral models have long been applied in Hungarian national economic planning. Price-quantity correspondences and interactions, however, cannot easily be taken into account in the traditional linear framework. Computable general equilibrium modellers in the West have developed techniques that make extensive use of price-quantity interdependences. However, since these techniques are usually presented within a controversial, strictly neoclassical interpretation, the possibility of adapting them to socialist planning models has been concealed. This paper reports some results of research investigating the possible adaptation of equilibrium modelling techniques to central planning models.

Relevance:

100.00%

Publisher:

Abstract:

Highways are generally designed to serve a mixed traffic flow that consists of passenger cars, trucks, buses, recreational vehicles, etc. The fact that the impacts of these different vehicle types are not uniform creates problems in highway operations and safety. A common approach to reducing the impacts of truck traffic on freeways has been to restrict trucks to certain lane(s) to minimize the interaction between trucks and other vehicles and to compensate for their differences in operational characteristics.

The performance of different truck lane restriction alternatives differs under different traffic and geometric conditions. Thus, a good estimate of the operational performance of the alternatives under prevailing conditions is needed to help make informed decisions. This study develops operational performance models that can be applied to help identify the most operationally efficient truck lane restriction alternative on a freeway under prevailing conditions. The operational performance measures examined in this study include average speed, throughput, speed difference, and lane changes. Prevailing conditions include number of lanes, interchange density, free-flow speeds, volumes, truck percentages, and ramp volumes.

Recognizing the difficulty of collecting sufficient data for an empirical modeling procedure that involves a high number of variables, a simulation approach was used to estimate the performance values for the various truck lane restriction alternatives under various scenarios. Both the CORSIM and VISSIM simulation models were examined for their ability to model truck lane restrictions. Due to a major problem found in the CORSIM model for truck lane modeling, the VISSIM model was adopted as the simulator for this study.

The VISSIM model was calibrated mainly to replicate the capacity given in the 2000 Highway Capacity Manual (HCM) for various free-flow speeds under ideal basic freeway section conditions. Non-linear regression models for average speed, throughput, average number of lane changes, and speed difference between the lane groups were developed. Based on the performance models developed, a simple decision procedure was recommended to select the desired truck lane restriction alternative for prevailing conditions.
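
As a hedged sketch of the kind of non-linear performance model fitted to simulation output, the code below fits an exponential speed-flow relationship with scipy.optimize.curve_fit. The functional form and the synthetic data are assumptions for illustration; the study's actual models, fitted to VISSIM output, are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic "simulation output": average speed falls with volume and truck share.
volume = rng.uniform(500, 2200, 200)   # veh/h/lane
trucks = rng.uniform(0.0, 0.3, 200)    # truck percentage as a fraction
speed = 70 * np.exp(-0.0003 * volume) * (1 - 0.5 * trucks) + rng.normal(0, 1, 200)


def model(X, ffs, beta, gamma):
    """Average speed: free-flow speed decayed by volume and truck share."""
    v, t = X
    return ffs * np.exp(-beta * v) * (1 - gamma * t)


params, _ = curve_fit(model, (volume, trucks), speed, p0=[65, 1e-4, 0.3])
ffs, beta, gamma = params
print(f"fitted free-flow speed={ffs:.1f} mph, beta={beta:.2e}, gamma={gamma:.2f}")
```

A decision procedure like the one the study recommends would evaluate such fitted models for each restriction alternative at the prevailing volumes and truck percentages, then pick the alternative with the best predicted performance.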

Relevance:

100.00%

Publisher:

Abstract:

Ensuring the correctness of software has been a major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received a lot of attention in the past years, with several methods, techniques, and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic.

This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined, which includes the evaluation of test cases based on Petri net testing theory to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (the SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
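
SAM's behavioral models are Petri nets checked against temporal properties. As a minimal illustration of the underlying state-space idea (not the SAM tool or Spin itself), the sketch below enumerates the reachable markings of a tiny place/transition net and checks a simple safety property; the net and property are invented for illustration.

```python
from collections import deque

# A tiny place/transition net: transition -> (consumed places, produced places).
TRANSITIONS = {
    "request": ({"idle": 1}, {"waiting": 1}),
    "grant":   ({"waiting": 1, "lock_free": 1}, {"critical": 1}),
    "release": ({"critical": 1}, {"idle": 1, "lock_free": 1}),
}
INITIAL = {"idle": 2, "lock_free": 1, "waiting": 0, "critical": 0}


def fire(marking, consumed, produced):
    """Fire a transition if enabled; return the new marking or None."""
    if any(marking.get(p, 0) < n for p, n in consumed.items()):
        return None
    new = dict(marking)
    for p, n in consumed.items():
        new[p] -= n
    for p, n in produced.items():
        new[p] = new.get(p, 0) + n
    return new


def reachable(initial):
    """Breadth-first enumeration of all reachable markings."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        yield m
        for consumed, produced in TRANSITIONS.values():
            nxt = fire(m, consumed, produced)
            if nxt is not None:
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)


# Safety property (mutual exclusion): never more than one token in "critical".
assert all(m.get("critical", 0) <= 1 for m in reachable(INITIAL))
print("mutual exclusion holds in all reachable markings")
```

A model checker such as Spin performs the same exhaustive exploration, but over a far richer state representation and against full temporal-logic formulas rather than a single invariant.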

Relevance:

100.00%

Publisher:

Abstract:

The composition and distribution of diatom algae inhabiting estuaries and coasts of the subtropical Americas are poorly documented, especially relative to the central role diatoms play in coastal food webs and to their potential utility as sentinels of environmental change in these threatened ecosystems. Here, we document the distribution of diatoms among the diverse habitat types and long environmental gradients represented by the shallow topographic relief of the South Florida, USA, coastline. A total of 592 species were encountered from 38 freshwater, mangrove, and marine locations in the Everglades wetland and Florida Bay during two seasonal collections, with the highest diversity occurring at sites of high salinity and low water total organic carbon (WTOC) concentration. Freshwater, mangrove, and estuarine assemblages were compositionally distinct, but seasonal differences were only detected at mangrove and estuarine sites, where solute concentration differed greatly between wet and dry seasons. Epiphytic, planktonic, and sediment assemblages were compositionally similar, implying a high degree of mixing along the shallow, tidal, and storm-prone coast. The relationships between diatom taxa and salinity, water total phosphorus (WTP), water total nitrogen (WTN), and WTOC concentrations were determined and incorporated into weighted averaging partial least squares regression models. Salinity was the most influential variable, resulting in a highly predictive model (apparent r² = 0.97, jackknifed r² = 0.95) that can be used in the future to infer changes in coastal freshwater delivery or sea-level rise in South Florida and compositionally similar environments. Models predicting WTN (apparent r² = 0.75, jackknifed r² = 0.46), WTP (apparent r² = 0.75, jackknifed r² = 0.49), and WTOC (apparent r² = 0.79, jackknifed r² = 0.57) were also strong, suggesting that diatoms can provide reliable inferences of changes in solute delivery to the coastal ecosystem.
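
The core of the weighted-averaging approach behind such transfer functions is brief: a species' salinity optimum is estimated as the abundance-weighted mean salinity of the sites where it occurs, and a sample's inferred salinity is the abundance-weighted mean of its species' optima. The sketch below shows plain weighted averaging on invented data; the study used the more elaborate WA-PLS variant.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented training set: site-by-species abundance matrix plus observed salinity.
n_sites, n_species = 30, 12
salinity = rng.uniform(0, 40, n_sites)              # observed salinity gradient
optima_true = rng.uniform(0, 40, n_species)         # species' true optima
# Gaussian unimodal responses of species abundance to salinity.
abundance = np.exp(-((salinity[:, None] - optima_true[None, :]) ** 2) / (2 * 8**2))

# Step 1: each species' optimum = abundance-weighted mean salinity of its sites.
optima_hat = (abundance * salinity[:, None]).sum(axis=0) / abundance.sum(axis=0)

# Step 2: a sample's inferred salinity = abundance-weighted mean of the optima.
inferred = (abundance * optima_hat[None, :]).sum(axis=1) / abundance.sum(axis=1)

r2 = np.corrcoef(salinity, inferred)[0, 1] ** 2
print(f"apparent r² of the toy weighted-averaging model: {r2:.2f}")
```

The jackknifed r² values the abstract reports come from leave-one-out cross-validation of exactly this kind of model, which guards against the optimism of the apparent r².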

Relevance:

100.00%

Publisher:

Abstract:

The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. Therefore, all the elements and components of the energy conversion system are modeled numerically and combined to obtain a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are based on circuit models that capture only the parasitic inner parameters of the power devices and the connections between the components. This dissertation aims to obtain appropriate physics-based models for power conversion systems that not only represent the steady-state behavior of the components but can also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition. The proposed physics-based model enables us to accurately develop components such as effective EMI filters, switching algorithms, and circuit topologies [7]. One application of the developed modeling technique is the design of new topologies for high-frequency, high-efficiency converters for variable speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications with the ability to overcome the blocking voltage limitations of available power semiconductor devices. Another advantage is the selection of the best-matching topology, with an inherent reduction of switching losses that can be exploited to improve overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints. This includes physical characteristics such as decreasing the size and weight of the package, optimizing interactions with neighboring components, and achieving higher power density. In addition, the electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions, as well as the design of attenuation measures and enclosures.
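
As a minimal illustration of what "predicting high-frequency characteristics" means in practice (and not the dissertation's actual models), the sketch below computes the frequency response of a capacitor with assumed parasitic series inductance and resistance, exposing the self-resonance that an ideal-component model would miss. All component values are invented placeholders.

```python
import numpy as np

# Hypothetical parasitics for a 1 uF capacitor: values are illustrative only.
C = 1e-6      # capacitance [F]
ESL = 15e-9   # equivalent series inductance [H]
ESR = 0.02    # equivalent series resistance [ohm]

freq = np.logspace(3, 8, 500)            # sweep 1 kHz .. 100 MHz
w = 2 * np.pi * freq
# Impedance of the series R-L-C model of the real (non-ideal) capacitor.
Z = ESR + 1j * w * ESL + 1 / (1j * w * C)

f_res = freq[np.argmin(np.abs(Z))]
print(f"self-resonant frequency ≈ {f_res / 1e6:.2f} MHz")
# Above f_res the part behaves inductively, which an ideal-C model cannot show;
# this is the kind of HF behavior that matters for EMI filter design.
```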

Relevance:

100.00%

Publisher:

Abstract:

My thesis examines fine-scale habitat use and movement patterns of age-1 Greenland cod (Gadus macrocephalus ogac) tracked using acoustic telemetry. Recent advances in tracking technologies such as GPS and acoustic telemetry have led to increasingly large and detailed datasets that present new opportunities for researchers to address fine-scale ecological questions regarding animal movement and spatial distribution. There is a growing demand for home range models that will not only work with massive quantities of autocorrelated data, but that can also exploit the added detail inherent in these high-resolution datasets. Most published home range studies use radio-telemetry or satellite data from terrestrial mammals or avian species, and most studies that evaluate the relative performance of home range models use simulated data. In Chapter 2, I used field-collected data from age-1 Greenland cod tracked with acoustic telemetry to evaluate the accuracy and precision of six home range models: minimum convex polygons, kernel densities with plug-in bandwidth selection and the reference bandwidth, adaptive local convex hulls, Brownian bridges, and dynamic Brownian bridges. I then applied the most appropriate model to two years (2010-2012) of tracking data collected from 82 tagged Greenland cod in Newman Sound, Newfoundland, Canada, to determine diel and seasonal differences in habitat use and movement patterns (Chapter 3). Little is known of juvenile cod ecology, so resolving these relationships will provide valuable insight into activity patterns, habitat use, and predator-prey dynamics, while filling a knowledge gap regarding the use of space by age-1 Greenland cod in a coastal nursery habitat. In doing so, my thesis demonstrates an appropriate technique for modelling the spatial use of fish from acoustic telemetry data that can be applied to high-resolution, high-frequency tracking datasets collected from mobile organisms in any environment.
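
Of the six home range models compared, the minimum convex polygon is the simplest to show in code: the home range is the convex hull of all telemetry fixes. The sketch below computes an MCP area with scipy on invented fixes; the thesis applies far more sophisticated estimators such as dynamic Brownian bridges.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(3)

# Invented telemetry fixes (x, y positions in metres) for one tagged fish.
fixes = rng.normal(loc=[500.0, 300.0], scale=[80.0, 40.0], size=(200, 2))

# Minimum convex polygon home range: the convex hull of all fixes.
hull = ConvexHull(fixes)
print(f"MCP area: {hull.volume:.0f} m^2")   # for 2-D hulls, .volume is the area
print(f"MCP boundary uses {len(hull.vertices)} of {len(fixes)} fixes")
```

The MCP's weakness, and one motivation for the comparison in Chapter 2, is that a single outlying fix inflates the estimated range, whereas density-based and bridge-based models weight where the animal actually spent its time.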

Relevance:

100.00%

Publisher:

Abstract:

Funded by: European Union's Horizon 2020 Marie Skłodowska-Curie programme (Grant Number 661211); Research Foundation Flanders (FWO) (Grant Numbers G.0055.08, G.0149.09, G.0308.13); the FWO Research Network on Eco-Evolutionary Dynamics; the French Ministère de l'Energie, de l'Ecologie, du Développement Durable et de la Mer through the EU FP6 BiodivERsA Eranet; and NERC (Grant Number NE/J008001/1).

Relevance:

100.00%

Publisher:

Abstract:

The authors would like to thank the College of Life Sciences of Aberdeen University and Marine Scotland Science, which funded CP's PhD project. Skate tagging experiments were undertaken as part of Scottish Government project SP004. We thank Ian Burrett for help in catching the fish and the other fishermen and anglers who returned tags. We thank José Manuel Gonzalez-Irusta for extracting and making available the layers used as environmental covariates in the environmental suitability modelling procedure. We also thank Jason Matthiopoulos for insightful suggestions on habitat utilization metrics, as well as Stephen C.F. Palmer and three anonymous reviewers for useful suggestions that improved the clarity and quality of the manuscript.

Relevance:

100.00%

Publisher:

Abstract:

Current interest in measuring quality of life is generating interest in the construction of computerized adaptive tests (CATs) with Likert-type items. Calibration of an item bank for use in CAT requires collecting responses to a large number of candidate items. However, the number is usually too large to administer to each subject in the calibration sample. The concurrent anchor-item design solves this problem by splitting the items into separate subtests, with some common items across subtests; then administering each subtest to a different sample; and finally running estimation algorithms once on the aggregated data array, from which a substantial number of responses are then missing. Although the use of anchor-item designs is widespread, the consequences of several configuration decisions on the accuracy of parameter estimates have never been studied in the polytomous case. The present study addresses this question by simulation, comparing the outcomes of several alternatives on the configuration of the anchor-item design. The factors defining variants of the anchor-item design are (a) subtest size, (b) balance of common and unique items per subtest, (c) characteristics of the common items, and (d) criteria for the distribution of unique items across subtests. The results of this study indicate that maximizing accuracy in item parameter recovery requires subtests of the largest possible number of items and the smallest possible number of common items; the characteristics of the common items and the criterion for distribution of unique items do not affect accuracy.
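
The anchor-item design itself is easy to show in code. The sketch below splits a candidate item bank into subtests sharing a block of common (anchor) items and assembles the aggregated response array with its characteristic missingness, which is what the calibration algorithm would then receive. The sizes and the placeholder response generator are invented for illustration; a real study would generate responses from a polytomous IRT model.

```python
import numpy as np

rng = np.random.default_rng(4)

n_items, n_common, n_subtests = 60, 10, 5
n_per_sample = 100                       # respondents per subtest sample

# Items 0..n_common-1 are anchors shared by every subtest; the remaining unique
# items are distributed evenly across the subtests.
unique = np.array_split(np.arange(n_common, n_items), n_subtests)
subtests = [np.concatenate([np.arange(n_common), u]) for u in unique]

# Aggregated data array: respondents x items, NaN where an item was not given.
total_resp = n_subtests * n_per_sample
data = np.full((total_resp, n_items), np.nan)
for s, items in enumerate(subtests):
    rows = slice(s * n_per_sample, (s + 1) * n_per_sample)
    # Placeholder Likert responses (1..5) standing in for an IRT generator.
    data[rows, items] = rng.integers(1, 6, (n_per_sample, len(items)))

missing = np.isnan(data).mean()
print(f"{n_subtests} subtests of {len(subtests[0])} items; "
      f"{missing:.0%} of the aggregated array is missing")
```

The study's design factors map directly onto this construction: subtest size is len(subtests[0]), the balance of common and unique items is the n_common / n_items split, and the distribution criterion governs how the unique blocks are assigned.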