941 results for Network scale-up method
Abstract:
People go through life making all kinds of decisions, and some of these decisions affect their demand for transportation, for example, their choices of where to live and where to work, how and when to travel and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply for prediction because dynamic programming problems need to be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we deal with this challenge by proposing innovative methods that reduce the estimation cost.
For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up the estimation of simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme is related to the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of their correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can be easily integrated into the usual optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest in various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
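The dynamic programming step at the heart of these models can be sketched as a soft-Bellman (logsum) recursion, in which the value of a state is the log-sum of exponentiated arc utilities plus downstream values, and choice probabilities come out as logit. The toy network, utilities and function names below are illustrative assumptions, not the thesis's actual implementation:

```python
import math

# Hypothetical toy network: state -> [(next_state, arc utility)]; "D" is the
# absorbing destination with value 0. All names and numbers are illustrative.
arcs = {
    "A": [("B", -1.0), ("C", -1.5)],
    "B": [("D", -1.0)],
    "C": [("D", -0.5)],
    "D": [],
}

def value_functions(arcs, dest="D", iters=50):
    # Soft-Bellman recursion: V(s) = log sum_a exp(u(s, a) + V(s')).
    # For this small acyclic toy, a few sweeps reach the fixed point.
    V = {s: 0.0 for s in arcs}
    for _ in range(iters):
        for s, out in arcs.items():
            if s != dest and out:
                V[s] = math.log(sum(math.exp(u + V[t]) for t, u in out))
    return V

def choice_probs(arcs, V, s):
    # Logit probabilities over the outgoing arcs at state s.
    w = [math.exp(u + V[t]) for t, u in arcs[s]]
    return {t: wi / sum(w) for (t, _u), wi in zip(arcs[s], w)}

V = value_functions(arcs)
probs = choice_probs(arcs, V, "A")  # both routes have total utility -2 here
```

Because both routes A→B→D and A→C→D have total utility -2 in this toy, the model splits the probability evenly between them, which is exactly the path-level logit behaviour the recursion is meant to reproduce without path enumeration.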
Abstract:
Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs, a combination of local network features. One technique to identify single node-motifs was presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different network series. Third, we provide an example of how the method can be used to analyse network time series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes, like hubs before them, might be found to play critical roles in real-world networks.
Abstract:
Base-level maps (or "isobase maps", as originally defined by Filosofov, 1960) express a relationship between valley order and topography. The base-level map can be seen as a "simplified" version of the original topographic surface, from which the "noise" of low-order stream erosion has been removed. This method is able to identify areas with possible tectonic influence even within lithologically uniform domains. Base-level maps have recently been applied in semi-detail-scale (e.g., 1:50,000 or larger) morphotectonic analysis. In this paper, we present an evaluation of the method's applicability in regional-scale analysis (e.g., 1:250,000 or smaller). A test area was selected in northern Brazil, at the lower course of the Araguaia and Tocantins rivers. The drainage network extracted from SRTM30_PLUS DEMs with a spatial resolution of approximately 900 m was visually compared with available topographic maps and considered compatible with a 1:1,000,000 scale. Regarding the interpretation of regional-scale morphostructures, the map constructed with 2nd- and 3rd-order valleys was considered to present the best results. Some of the interpreted base-level anomalies correspond to important shear zones and geological contacts present in the 1:5,000,000 Geological Map of South America. Others have no correspondence with mapped Precambrian structures and are considered to represent younger, probably neotectonic, features. A strong E-W orientation of the base-level lines over the inflexion of the Araguaia and Tocantins rivers suggests a major drainage capture. A N-S topographic swath profile over the Tocantins and Araguaia rivers reveals a topographic pattern which, allied with seismic data showing a roughly N-S direction of extension in the area, leads us to interpret this lineament as an E-W, southward-dipping normal fault. There is also a good visual correspondence between the base-level lineaments and geophysical anomalies.
A NW-SE lineament in the southeast of the study area partially corresponds to the northern border of the Mosquito lava field, of Jurassic age, and a NW-SE lineament traced in the northeastern sector of the study area can be interpreted as the Picos-Santa Ines lineament, identifiable in geophysical maps but with little expression in hypsometric or topographic maps.
Abstract:
The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is harder still, since it additionally requires a solution in real time. Both DS problems are computationally complex. For large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. In addition, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results have shown that MEAN is able to find adequate restoration plans for a real DS with 3,860 buses and 632 switches in a running time of 0.68 s. Moreover, MEAN has shown a sublinear running time as a function of system size. Tests with networks ranging from 632 to 5,166 switches indicate that MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
Abstract:
In this work we show that the dengue epidemic in the city of Singapore organized itself into a scale-free network of transmission as the 2000-2005 outbreaks progressed. This scale-free network of clusters comprised geographical breeding places for Aedes mosquitoes, acting as super-spreader nodes in a network of transmission. The geographical organization of the network was analysed via the corresponding distribution of the weekly number of new cases. Therefore, our hypothesis is that the distribution of dengue cases reflects the geographical organization of a transmission network, which evolved towards a power law as the epidemic intensity progressed until 2005. (c) 2007 Elsevier Inc. All rights reserved.
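A power-law tail of the kind hypothesized above is often checked by regressing log-frequency against log-size. The data below are fabricated to follow an exact size^-2 law purely for illustration; for real epidemic data, maximum-likelihood estimators are preferred over this simple log-log regression:

```python
import math

# Hypothetical cluster-size distribution (e.g., cases per breeding-place
# cluster); the counts are constructed to follow counts ~ size^-2 exactly.
sizes  = [1, 2, 4, 8, 16, 32]
counts = [1024, 256, 64, 16, 4, 1]

def powerlaw_exponent(sizes, counts):
    # Least-squares slope of log(count) vs log(size): counts ~ size^alpha.
    xs = [math.log(s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

alpha = powerlaw_exponent(sizes, counts)  # -> -2.0 for this exact data
```

On real case counts the fit will be noisy, so the estimated exponent (and whether a power law is even the best model) should be tested rather than read off a single regression.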
Abstract:
Previous Monte Carlo studies have investigated the multileaf collimator (MLC) contribution to the build-up region for fields in which the MLC leaves fully blocked the openings defined by the collimation jaws. In the present work, we investigate the same effect but for symmetric and asymmetric MLC-defined field sizes (2×2, 4×4, 10×10 and 3×7 cm2). A Varian 2100C/D accelerator with a 120-leaf MLC is accurately modeled for a 6 MV photon beam using the BEAMnrc/EGSnrc code. Our results indicate that particles scattered from the accelerator head and MLC are responsible for an increase of about 7% in surface dose when comparing the 2×2 and 10×10 cm2 fields. We found that the MLC contribution to the total build-up dose is about 2% for the 2×2 cm2 field and less than 1% for the largest fields.
Abstract:
Ancillary services represent a good business opportunity that must be considered by market players. This paper presents a new methodology for ancillary services market dispatch. The method considers the bids submitted to the market and includes a market clearing mechanism based on deterministic optimization. An Artificial Neural Network is used for day-ahead prediction of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve requirements. Two test cases based on California Independent System Operator data concerning dispatch of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve services are included in this paper to illustrate the application of the proposed method: (1) dispatch considering simple bids; (2) dispatch considering complex bids.
Abstract:
Most research work on WSNs has focused on protocols or on specific applications. There is a clear lack of easy, ready-to-use WSN technologies and tools for planning, implementing, testing and commissioning WSN systems in an integrated fashion. While there exists a plethora of papers about network planning and deployment methodologies, to the best of our knowledge none of them helps the designer to match coverage requirements with network performance evaluation. In this paper we aim at filling this gap by presenting a unified toolset, i.e., a framework able to provide a global picture of the system, from network deployment planning to system test and validation. This toolset has been designed to back up the EMMON WSN system architecture for large-scale, dense, real-time embedded monitoring. It includes tools for network deployment planning, worst-case analysis and dimensioning, protocol simulation, and automatic remote programming and hardware testing. This toolset was paramount in validating the system architecture through DEMMON1, the first EMMON demonstrator, i.e., a 300+ node test-bed, which is, to the best of our knowledge, the largest single-site WSN test-bed in Europe to date.
Abstract:
It is important to understand and forecast the daily consumption of a typical or a particular household in order to design and size suitable renewable energy systems and energy storage. In this research, Artificial Neural Networks (ANN) were used for Short Term Load Forecasting (STLF) and, despite the unpredictability of consumption, it is shown that the electricity consumption of a household can be forecast with confidence. ANNs are recognized as a potential methodology for modeling hourly and daily energy consumption and load forecasting. Input variables such as apartment area, number of occupants, electrical appliance consumption and Boolean inputs such as the hourly meter system were considered. Furthermore, the investigation carried out aims to define an ANN architecture and a training algorithm in order to achieve a robust model for forecasting energy consumption in a typical household. It was observed that a feed-forward ANN and the Levenberg-Marquardt algorithm provided good performance. The research used a database of consumption records logged in 93 real households in Lisbon, Portugal, between February 2000 and July 2001, including both weekdays and weekends. The results show that the ANN approach provides a reliable model for forecasting household electric energy consumption and load profile. © 2014 The Author.
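The feed-forward shape reported above can be sketched as one hidden tanh layer followed by a linear output. The paper does not give its architecture or weights, so the layer sizes, weight values and input scaling below are purely illustrative; only the choice of input variables follows the abstract:

```python
import math

def mlp_forward(x, W1, b1, W2, b2):
    # One hidden tanh layer, linear output: a minimal feed-forward ANN.
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

# Hypothetical scaled inputs: [apartment area, occupants, appliance load,
# hour-of-day flag] -- the variable types the abstract lists.
x  = [0.8, 3.0, 1.2, 1.0]
W1 = [[0.1, 0.2, 0.3, -0.1], [0.05, -0.2, 0.1, 0.3]]  # arbitrary weights
b1 = [0.0, 0.1]
W2 = [0.5, -0.4]
b2 = 0.2
pred_kwh = mlp_forward(x, W1, b1, W2, b2)
```

In practice the weights would be fitted by a training algorithm such as Levenberg-Marquardt, which the paper found to perform well; the sketch only shows how the fitted network maps household features to a load prediction.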
Abstract:
Dissertation for obtaining the degree of Master in Electrical Engineering, Energy Branch
Abstract:
We examine the constraints on the two Higgs doublet model (2HDM) due to the stability of the scalar potential and the absence of Landau poles at energy scales below the Planck scale. We employ the most general 2HDM that incorporates an approximately Standard Model (SM) Higgs boson with a flavor-aligned Yukawa sector to eliminate potential tree-level Higgs-mediated flavor-changing neutral currents. Using basis-independent techniques, we exhibit robust regimes of the 2HDM parameter space with a 125 GeV SM-like Higgs boson that is stable and perturbative up to the Planck scale. Implications for the heavy scalar spectrum are exhibited.
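For context, the stability requirement referred to above is usually stated at tree level, in the basis where the quartic couplings take their simplest form, as the familiar bounded-from-below conditions on the 2HDM potential (the paper's basis-independent formulation generalizes these):

```latex
\begin{aligned}
&\lambda_1 > 0, \qquad \lambda_2 > 0,\\
&\lambda_3 > -\sqrt{\lambda_1 \lambda_2},\\
&\lambda_3 + \lambda_4 - \lvert\lambda_5\rvert > -\sqrt{\lambda_1 \lambda_2}.
\end{aligned}
```

Requiring these inequalities to hold under renormalization-group running up to the Planck scale, together with perturbativity of the couplings, is what carves out the parameter regimes the abstract describes.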
Abstract:
Chagas disease is a chronic, tropical, parasitic disease, endemic throughout Latin America. The large-scale migration of populations has increased the geographic distribution of the disease, and cases have been observed in many other countries around the world. To strengthen the critical mass of knowledge generated in different countries, it is essential to promote cooperative and translational research initiatives. We analyzed authorship of scientific documents on Chagas disease indexed in the Medline database from 1940 to 2009. Bibliometrics was used to analyze the evolution of collaboration patterns. A Social Network Analysis was carried out to identify the main research groups in the area by applying clustering methods. We analyzed 13,989 papers produced by 21,350 authors. Collaboration among authors increased dramatically over the study period, reaching an average of 6.2 authors per paper in the last five-year period. Applying a threshold of five or more papers signed in co-authorship, we identified 148 consolidated research groups made up of 1,750 authors. The Chagas disease network identified constitutes a "small world," characterized by a high degree of clustering and a notably high number of Brazilian researchers.
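The co-authorship thresholding step described above (keep an edge only when two authors share five or more papers) can be sketched as follows; the author names and paper lists are fabricated stand-ins for the Medline records:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author lists, one per paper (the study used 13,989 records).
papers = [
    ["Silva", "Costa"], ["Silva", "Costa"], ["Silva", "Costa"],
    ["Silva", "Costa"], ["Silva", "Costa"],   # five joint papers
    ["Silva", "Mota"], ["Costa", "Mota"],     # below the threshold
]

# Count how many papers each unordered author pair has co-signed.
pair_counts = Counter(
    tuple(sorted(pair))
    for authors in papers
    for pair in combinations(set(authors), 2)
)

# Keep only pairs with five or more co-authored papers, as in the study.
edges = {pair for pair, n in pair_counts.items() if n >= 5}
# -> {("Costa", "Silva")}
```

Clustering methods are then run on the resulting thresholded graph to delineate the consolidated research groups; the threshold filters out incidental one-off collaborations before that step.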
Abstract:
An efficient method for breeding Biomphalaria tenagophila (Taim lineage/RS) was developed over a 5-year period (2005-2010). Special facilities were provided, consisting of four cement tanks (9.4 x 0.6 x 0.22 m) with their bottoms covered with a layer of sterilized red earth and calcium carbonate. Standard measures were adopted, as follows: each tank should contain an average of 3,000 specimens and would be provided with a daily ration of 35,000 mg complemented with lettuce. A greenhouse-effect heating system was developed, consisting of movable dark canvas covers, which allowed the temperature to be controlled between 20-24 °C. This system was essential, especially during the coldest months of the year. Approximately 27,000 specimens with a diameter of 12 mm or more were produced during a 14-month period. The mortality rates of the newly hatched and adult snails were 77% and 37%, respectively. The follow-up of the development of 310 specimens of B. tenagophila demonstrated that 70-day-old snails reached an average diameter of 17.0 ± 0.9 mm. The mortality rates and the development performance of B. tenagophila snails can be considered highly satisfactory when compared with other results in the literature for different species of the genus Biomphalaria under controlled laboratory conditions.
Abstract:
As technology advances, not only do new standards and programming styles appear but some of the previously established ones also gain relevance. In a new Internet paradigm where interconnection between small devices is key to the development of new businesses and scientific advancement, there is a need for simple solutions that anyone can implement, so that ideas can become more than just ideas. Open-source software is alive and well, especially in the area of the Internet of Things. This opens windows for many low-capital entrepreneurs to experiment with their ideas and actually develop prototypes, which can help identify problems with a project or shine light on possible new features and interactions. As programming becomes more and more popular among people from fields unrelated to software, there is a need for guidance in developing something beyond basic algorithms, which is where this thesis comes in: a comprehensive document explaining the challenges and available choices in developing a sensor data and message delivery system that scales well and implements the delivery of critical messages. Modularity and extensibility were also given much importance, making this an affordable tool for anyone who wants to build a sensor network of this kind.
Watershed-scale runoff routing and solute transport in a spatially aggregated hydrological framework
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies