890 results for Context Model
Abstract:
The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modelled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial, and managing it is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design in which the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage the numerous software libraries available in a variety of languages, while retaining the legacy option of custom tab-separated text formats where a researcher prefers that access arrangement. By decoupling the data model from data persistence, it becomes much easier to swap in, for instance, relational databases to provide stricter provenance and audit-trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from the CF conventions has been designed to handle time series for SWIFT efficiently.
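As an illustrative sketch of the decoupling described above, a configuration object can be serialized through interchangeable persistence back-ends; the `SubareaConfig` class, its fields and the two formats below are hypothetical, not SWIFT's actual API:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical configuration data model, independent of on-disk format.
@dataclass
class SubareaConfig:
    name: str
    area_km2: float
    model: str

def to_json(cfg: SubareaConfig) -> str:
    """Persist via JSON, the preferred format for research use."""
    return json.dumps(asdict(cfg))

def from_json(text: str) -> SubareaConfig:
    """Rebuild the data model from its JSON persistence."""
    return SubareaConfig(**json.loads(text))

def to_tsv(cfg: SubareaConfig) -> str:
    """Persist via a legacy tab-separated text format."""
    return "\t".join(str(v) for v in asdict(cfg).values())
```

Because the data model is a plain object, adding a relational-database back-end would mean writing one more pair of persistence functions, leaving the model itself untouched.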
Abstract:
Owing to increasing demand for water and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consulting offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, merging hydrological and reservoir simulation/optimization models with various numerical weather predictions, radar and satellite data. Model performance is closely tied to the streamflow forecast, its related uncertainty and how that uncertainty is considered in decision making. Whereas deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on an earlier developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in the northwestern part of Turkey. The reservoir is a typical example, with its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main inputs, and flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is built with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Thus, compared with the deterministic approach, it provides the operator with risk ranges for spillway discharges and reservoir levels.
The framework is fully data-driven, practical and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
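The risk ranges derived from equally probable ensemble scenarios can be sketched as a simple quantile summary per lead time; the 10th-90th percentile band and the function below are illustrative assumptions, not the study's actual post-processing:

```python
import statistics

def risk_range(ensemble):
    """ensemble: list of member traces, each a list of flows per time step.
    Returns one (p10, median, p90) tuple per time step, summarising the
    spread of equally probable scenarios into an operator-facing band."""
    horizon = len(ensemble[0])
    bands = []
    for t in range(horizon):
        values = sorted(member[t] for member in ensemble)
        # n=10 yields nine cut points; the first and last are the
        # 10th and 90th percentiles.
        cuts = statistics.quantiles(values, n=10, method="inclusive")
        bands.append((cuts[0], statistics.median(values), cuts[-1]))
    return bands
```

Plotting the band against reservoir level or spillway-discharge limits is what turns the ensemble into a risk range for the operator.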
Abstract:
A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of its dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models, which are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, has been found crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-averaged Navier-Stokes equations, using a Boussinesq approach, with a standard k-ε turbulence closure to solve the flow field. The thermal model includes a heat source term that accounts for short-wave radiation as well as heat convection at the free surface, the latter a function of air temperature, wind velocity and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations have been solved with OpenFOAM, an open-source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can easily be extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model might be able to compute the total heat storage of water bodies in order to estimate evaporation from the water surface.
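Two ingredients named above, the Boussinesq approximation and the convective heat exchange at the free surface, can be sketched with standard bulk formulas; the transfer coefficient and reference constants below are illustrative assumptions, not values from the paper:

```python
RHO0 = 998.2   # reference water density at T0 (kg/m^3), assumed value
BETA = 2.1e-4  # thermal expansion coefficient (1/K), assumed value
T0 = 20.0      # reference temperature (degC)

def boussinesq_density(temp_c: float) -> float:
    """Boussinesq approach: density varies linearly with temperature,
    and the variation enters the buoyancy term only."""
    return RHO0 * (1.0 - BETA * (temp_c - T0))

def sensible_heat_flux(t_air, t_surface, wind_speed,
                       c_h=1.3e-3, rho_air=1.2, cp_air=1005.0):
    """Bulk aerodynamic formula for convective (sensible) heat exchange
    at the free surface, in W/m^2; positive values warm the water.
    c_h would in practice depend on atmospheric stability."""
    return rho_air * cp_air * c_h * wind_speed * (t_air - t_surface)
```

In the actual solver these terms feed the momentum equation's buoyancy source and the temperature equation's surface boundary condition, respectively.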
Abstract:
Canada releases over 150 billion litres of untreated and undertreated wastewater into the water environment every year. To clean up urban wastewater, new federal Wastewater Systems Effluent Regulations (WSER), establishing national baseline effluent quality standards achievable through secondary wastewater treatment, were enacted on July 18, 2012. With respect to wastewater from combined sewer overflows (CSOs), the Regulations require municipalities to report the annual quantity and frequency of effluent discharges. The City of Toronto currently has about 300 CSO locations within an area of approximately 16,550 hectares. The total sewer length of the CSO area is about 3,450 km and the number of sewer manholes is about 51,100. System-wide monitoring of all CSO locations has never been undertaken because of cost and practicality. Instead, the City has relied on estimation methods and modelling approaches, allowing funds that would otherwise be used for monitoring to be applied to reducing the impacts of the CSOs. To fulfill the WSER requirements, the City is now undertaking a study whose approach is GIS-based hydrologic and hydraulic modelling. Results show the usefulness of this approach for 1) determining the flows contributing to the combined sewer system in the local and trunk sewers under dry weather flow, wet weather flow and snowmelt conditions; 2) assessing the hydraulic grade line and surface water depth in all local and trunk sewers under heavy rain events; 3) analysing local and trunk sewer capacities for future growth; and 4) reporting the annual quantity and frequency of CSOs as required by the new Regulations. This modelling approach has also allowed funds to be applied toward reducing and ultimately eliminating the adverse impacts of CSOs rather than expending resources on unnecessary and costly monitoring.
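The reporting requirement, the annual quantity and frequency of overflows, can be sketched as post-processing of a modelled overflow hydrograph; the function, threshold and time step below are hypothetical, not the study's actual method:

```python
def cso_summary(flows_m3s, timestep_s=900, threshold=0.0):
    """Return (event_count, total_volume_m3) from a modelled overflow
    flow series. Consecutive above-threshold steps count as one event,
    so the pair maps directly to 'frequency' and 'quantity'."""
    events, volume, in_event = 0, 0.0, False
    for q in flows_m3s:
        if q > threshold:
            volume += q * timestep_s  # integrate flow over the step
            if not in_event:
                events += 1           # a new overflow event starts
                in_event = True
        else:
            in_event = False          # event ends when flow stops
    return events, volume
```

Run per CSO location over a modelled year, the tuples can be summed into the annual figures the Regulations ask for.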
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces, but it has not considered procedures and methods for designing river geospatial data. Ultimately, river geospatial data has its own model, and when data are shared through open interfaces among heterogeneous GIS databases, differences between models result in a loss of information. In this study a plan was proposed both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and improving and redesigning the database. Primary and foreign keys, which distinguish attribute information and entity linkages, were redefined to increase usability. The construction of attribute tables and the entity-relationship diagram were redefined to redesign the linkages among tables from the perspective of a river standard database. In addition, this study sought to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
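The role of redefined primary and foreign keys in preserving entity linkages can be sketched with a small relational schema; the table and column names below are hypothetical illustrations, not the actual river standard database:

```python
import sqlite3

# In-memory sketch: a river entity table and an attribute table linked
# to it by a foreign key, so shared data cannot lose the linkage.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
conn.execute("""
    CREATE TABLE river (
        river_id TEXT PRIMARY KEY,  -- stable identifier shared across DBs
        name     TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE river_facility (
        facility_id TEXT PRIMARY KEY,
        river_id    TEXT NOT NULL REFERENCES river(river_id),
        kind        TEXT
    )""")
conn.execute("INSERT INTO river VALUES ('R001', 'Han River')")
conn.execute("INSERT INTO river_facility VALUES ('F001', 'R001', 'levee')")
```

With the foreign key enforced, an attribute row pointing at a nonexistent river is rejected rather than silently orphaned, which is the kind of information loss the redesign targets.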
Abstract:
In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delays and uncertainties into account, can be designed for multi-objective management problems and for large-scale systems. Nonetheless, a critical obstacle to be overcome in MPC is the large computational burden when a large-scale system is considered or a long prediction horizon is involved. To address this problem, we use an adaptive prediction accuracy (APA) approach that can reduce the computational burden almost by half. The proposed MPC-APA scheme is tested on the northern Dutch water system, which comprises Lake IJssel, Lake Marker, the River IJssel and the North Sea Canal. The simulation results show that the MPC-APA scheme reduces computational time to a large extent and that a flood protection problem over longer prediction horizons can be solved well.
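One way an adaptive-prediction-accuracy scheme can roughly halve the number of decision variables is a non-uniform control grid: fine resolution over the near horizon, coarser blocks further out. The specific coarsening rule below is an illustrative assumption, not the paper's algorithm:

```python
def apa_grid(horizon_steps, fine_steps, coarsen=2):
    """Return decision-step lengths covering `horizon_steps` base steps:
    single-step resolution up to `fine_steps`, then blocks of `coarsen`
    base steps. Fewer entries means fewer optimization variables."""
    grid = [1] * min(fine_steps, horizon_steps)
    remaining = horizon_steps - len(grid)
    while remaining > 0:
        block = min(coarsen, remaining)
        grid.append(block)       # one decision covers `block` base steps
        remaining -= block
    return grid
```

For a 24-step horizon with 8 fine steps and pairwise coarsening, the optimizer handles 16 decision variables instead of 24, while the near-term control accuracy is preserved.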
Abstract:
Since Henry George (1839-1897), economists have argued that a tax on unimproved land is an ideal tax on efficiency grounds. Output taxes, on the other hand, have distortionary effects on the economy. This paper shows that under asymmetric information, output taxes might be used along with a land tax to implement an optimal taxation scheme in a Latin American context, i.e., one where land rental markets are relatively thin, land property provides non-agricultural payoffs, and there are non-revenue objectives of land taxation. The model also has two implications that can be tested empirically: (i) there is evasion when schemes based only on land taxes are implemented; (ii) this evasion is more severe for large landholders.
Abstract:
This research is in the domains of materialism, consumer vulnerability and consumption indebtedness, concepts frequently approached in the literature on consumer behavior, macro-marketing and economic psychology. The influence of materialism on consumer indebtedness is investigated within a context characterized by poverty and by factors that cause vulnerability, such as high interest rates and limited access to credit and to quality affordable goods. The objectives of this research are: to produce a materialism scale that is well adapted to its environment, characterizing materialism adequately for the population studied; to compare the results obtained with those of other studies; and to measure the relationship between materialism, socio-demographic variables, attitude to debt and consumption indebtedness. The primary data used in the analyses were collected in field research carried out in August 2005, relying on a probabilistic household sample of 450 low-income individuals living in poor regions of the city of Sao Paulo. The materialism scale, adapted and translated into Portuguese from Richins (2004), proved very successful and encourages new work in the area. Younger adults tend to be more materialistic than older ones; illiterate adults tend to be less materialistic than those who took literacy courses as adults; and gender, income and race are not associated with the materialism construct. Among the other results, a logistic regression model was developed to distinguish individuals who have an installment plan payment booklet from those who do not, based on materialism, socio-demographic variables and purchasing and consumer habits. The proposed model confirms materialism as a behavioral variable useful for forecasting the probability of an individual getting into debt in order to consume, in some cases almost doubling the chance of occurrence of this event.
The findings confirm the thesis that it is not only adverse economic factors that lead people into debt, and that the study of demand for consumer credit must necessarily include variables of a psychological nature. It is suggested that low-income materialistic consumers experience feelings of powerlessness and exclusion because of the gap between their possessions and their desires. Lines of conduct to combat this marginalization from the consumer society are drawn, targeting marketing professionals, public policy makers and vulnerability researchers. Finally, the possibility of new studies involving the materialism construct, which is central to the literature on consumer behavior albeit little used in empirical studies in Brazil, is discussed.
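The reported effect, materialism in some cases almost doubling the chance of indebtedness, corresponds to an odds ratio near two in a logistic model. A minimal sketch of how such a model scores an individual follows; the coefficients are invented for illustration, not estimated from the study's data:

```python
import math

def debt_probability(materialism_score, intercept=-1.5, beta=0.65):
    """Predicted probability of holding an installment payment booklet,
    from a logistic model with illustrative coefficients."""
    logit = intercept + beta * materialism_score
    return 1.0 / (1.0 + math.exp(-logit))

def odds_ratio(beta=0.65):
    """Multiplicative change in the odds of indebtedness per one-unit
    increase in the materialism score: exp(beta)."""
    return math.exp(beta)
```

With beta = 0.65 the odds ratio is about 1.92, i.e. each unit of materialism nearly doubles the odds, matching the magnitude the abstract describes.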
Abstract:
This paper makes two original contributions. First, we show that the present value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something already investigated in the VECM context by Johansen and Swensen (1999, 2011) but not previously discussed with this emphasis. We also provide the present value reduced-rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced-rank restrictions. The reason appropriate common cyclical feature restrictions might improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters that would otherwise increase forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second the level of real price and dividend for the S&P composite index, and the third the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when the restrictions are applied. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.
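The present value relation underlying the tested restrictions can be sketched as a finite-horizon discounted sum of expected dividends; the discount factor below is an illustrative assumption:

```python
def pvm_price(expected_dividends, discount=0.96):
    """Finite-horizon approximation of the present value model:
    P_t = sum over i of discount**i * E[d_{t+i}], i = 1..H."""
    return sum(discount ** (i + 1) * d
               for i, d in enumerate(expected_dividends))
```

With a constant expected dividend d, the sum converges to the closed form d * discount / (1 - discount), which is the long-run relation that ties prices and dividends together and generates the reduced-rank restrictions the paper tests.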
Abstract:
This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency: nonparametric analysis of covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis, and likely to affect efficiency, are bank nature (multiple and commercial), bank type (credit, business, bursary and retail), bank size (large, medium, small and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans, the last being a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric analysis of covariance. The significance of a factor varies according to the model fit, although there is some agreement among the best models. A highly significant association in all models fitted is observed only for non-performing loans. The nonparametric analysis of covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
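A DEA efficiency score can be sketched in the special case of a single combined output and a single input, where the constant-returns ratio model reduces to comparing each bank's output/input ratio against the best performer; the general multi-input case requires linear programming, and the data here are invented for illustration:

```python
def dea_single_ratio(outputs, inputs):
    """Efficiency scores in (0, 1] for the single-output, single-input
    case: each unit's output/input ratio relative to the best ratio.
    A score of 1 marks a unit on the efficient frontier."""
    ratios = [o / i for o, i in zip(outputs, inputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

In the paper's output-oriented setting, the inefficiency of a unit is then the proportional output expansion needed to reach the frontier, i.e. the reciprocal of this score.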
Abstract:
For some years, mobile technologies in healthcare (mHealth) have been regarded as a transformational force for improving health in low- and middle-income countries (LMICs). Although several studies have identified the prevailing issue of inconsistent evidence, and new evaluation frameworks have been proposed, few have explored the role of entrepreneurship in creating disruptive change in a traditionally conservative sector. I argue that improving the effectiveness of mHealth entrepreneurs might increase the adoption of mHealth solutions. Thus, this study aims at proposing a managerial model for the analysis of mHealth solutions from the entrepreneurial perspective in the context of LMICs. I identified the Khoja–Durrani–Scott (KDS) framework as the theoretical basis for the managerial model, due to its explicit focus on the context of LMICs. In the subsequent exploratory research I first used semi-structured interviews with five specialists in mHealth, local healthcare systems and investment to identify necessary adaptations to the model. The interview findings indicated that the economic theme in particular had to be clarified and that an additional entrepreneurial theme was necessary. An evaluation questionnaire was also proposed. In the second phase, I applied the questionnaire to five start-ups operating in Brazil and Tanzania and conducted semi-structured interviews with the entrepreneurs to gain practical insights for the theoretical development. Three of the five entrepreneurs perceived that the results correlated with their expectations of the strengths and weaknesses of the start-ups. The main shortcomings of the model related to the ambiguity of some questions. In addition to the findings for the model, the resulting scores were analyzed. The analysis suggested that, across the participating mHealth start-ups, the ‘behavioral and socio-technical’ outcomes were the strongest and the ‘policy’ outcomes the weakest themes.
The managerial model integrates several perspectives, structured around the entrepreneur. In order to validate the model, future research may link the development of a start-up with the evolution of the scores in longitudinal case studies or large-scale tests.
Abstract:
Reviewing the definition and measurement of speculative bubbles in the context of contagion, this paper analyses the DotCom bubble in American and European equity markets using the dynamic conditional correlation (DCC) model proposed by Engle and Sheppard (2001), drawing on econometrics on the one hand and on behavioral finance as a psychological explanation on the other. Contagion is defined in this context as a statistical break in the computed DCCs, as measured by shifts in their means and medians. Although it is surprising that contagion is lower during price bubbles, the main finding indicates the presence of contagion across the different indices on the two continents and demonstrates structural changes during financial crises.
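The contagion test described above, a break in the means and medians of the DCCs, can be sketched once a DCC series is in hand; the series and break date below are illustrative, and the DCC estimation itself is assumed to have been done elsewhere:

```python
import statistics

def correlation_shift(dcc_series, break_index):
    """Split an estimated DCC series at a candidate break date and
    return (shift_in_mean, shift_in_median) across the break.
    Large shifts are read as evidence of contagion."""
    pre = dcc_series[:break_index]
    post = dcc_series[break_index:]
    return (statistics.mean(post) - statistics.mean(pre),
            statistics.median(post) - statistics.median(pre))
```

In practice the break date would come from the bubble chronology (e.g. the DotCom crash), and the shifts would be assessed with a formal test rather than by inspection.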
Abstract:
With the current proliferation of sensor-equipped mobile devices such as smartphones and tablets, location-aware services are expanding beyond the mere efficiency- and work-related needs of users, evolving to incorporate fun, culture and users' social lives. Today, people on the move have ever more connectivity and expect to be able to communicate with their usual and familiar social networks: not only with peers and colleagues, friends and family, but also with unknown people who might share their interests or curiosities, or who happen to use the same social network. Through social networks, location-aware blogging and cultural mobile applications, relevant information is now available at specific geographical locations and open to feedback and conversations among friends as well as strangers. Indeed, smartphone technologies now allow users to post and retrieve content while on the move, often relating to specific physical landmarks or locations, engaging and being engaged in conversations with strangers as much as with their own social network. The use of such technologies and applications while on the move can often lead people to serendipitous discoveries and interactions. In this thesis we engage in a twofold investigation: how can we foster and support serendipitous discoveries, and what are the best interfaces for doing so? Reading and writing content while on the move is a cognitively intensive task; while the map serves the function of orienting the user, it also absorbs most of the user's concentration. To address this kind of cognitive overload, with Breadcrumbs we propose a 360-degree interface that enables users to find content around them by scanning the surrounding space with the mobile device.
Using a loose metaphor of a periscope and harnessing the power of the smartphone's sensors, we designed an interactive interface capable of detecting content around the user and displaying it in the form of two-dimensional bubbles whose diameter depends on the content's distance from the user. Users navigate the space in relation to the content they are curious about, rather than in relation to the traditional geographical map. Through this model we aim to alleviate the cognitive overload generated by continuously having to reconcile a two-dimensional map with the real three-dimensional space surrounding the user, and also to use the content itself as a navigational filter. Furthermore, this alternative means of navigating space might bring serendipitous discoveries of places that users were not aware of or intending to reach. We conclude the thesis with an evaluation of the Breadcrumbs application, comparing the 360-degree interface with a traditional two-dimensional map displayed on the device screen. The results of the evaluation are compiled into findings and insights for future use in designing and developing context-aware mobile applications.
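The periscope-style interaction can be sketched with three pieces: a bearing from the user to each content item, a field-of-view test against the device heading, and a distance-dependent bubble diameter. The flat-plane geometry, function names and parameters below are all hypothetical illustrations, not the Breadcrumbs implementation:

```python
import math

def bearing_deg(user_xy, content_xy):
    """Compass-style bearing (0 = north/+y, clockwise) from user to
    content, assuming a flat local plane."""
    dx = content_xy[0] - user_xy[0]
    dy = content_xy[1] - user_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def visible(heading_deg, target_bearing, fov_deg=60.0):
    """True if the content's bearing lies within the device's field of
    view, i.e. it appears when the user 'scans' toward it."""
    diff = (target_bearing - heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def bubble_diameter(distance_m, max_px=120.0, scale_m=50.0):
    """Closer content gets a larger bubble; diameter decays smoothly
    with distance."""
    return max_px * scale_m / (scale_m + distance_m)
```

Sweeping the phone changes `heading_deg`; content bubbles pop in and out of the view and shrink with distance, letting users navigate by content rather than by map.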
Abstract:
Organizations are complex systems. A conceptual model of the enterprise is needed that is: coherent (the distinguished aspect models constitute a logical and truly integral whole); comprehensive (all relevant issues are covered); consistent (the aspect models are free from contradictions or irregularities); concise (no superfluous matters are contained in it); and essential (it shows only the essence of the enterprise, i.e., the model abstracts from all realization and implementation issues). The world is in great need of transparency about the operation of all the systems we work with daily, ranging from domestic appliances to large societal institutions. In this context the field of enterprise ontology has emerged, with the aim of creating models that help us understand the essence of the construction and operation of complete systems, more specifically, of enterprises. Enterprise ontology is a way to look through the distracting and confusing appearance of an enterprise right into its deep kernel, which, from the perspective of the system designer, provides the tools needed to design a successful system that reflects the desires and needs of the enterprise's workers. This project's context is the use of DEMO (Design and Engineering Methodology for Organizations) for (re)designing or (re)engineering an enterprise, namely a process of the construction department of a city hall; the lack of a well-founded theory about the construction and operation of this process was the motivation behind this work. The purpose of studying and applying the DEMO theory and method was to optimize the process, automating it as much as possible, while reducing paper use and the time spent between tasks, and providing a better service to citizens.