939 results for Individual-based modeling


Relevance: 80.00%

Publisher:

Abstract:

Canopy and aerodynamic conductances (gC and gA) are two of the key land surface biophysical variables that control the land surface response of land surface schemes in climate models. Their representation is crucial for predicting transpiration (λET) and evaporation (λEE) flux components of the terrestrial latent heat flux (λE), which has important implications for global climate change and water resource management. By physical integration of radiometric surface temperature (TR) into an integrated framework of the Penman-Monteith and Shuttleworth-Wallace models, we present a novel approach to directly quantify the canopy-scale biophysical controls on λET and λEE over multiple plant functional types (PFTs) in the Amazon Basin. Combining data from six LBA (Large-scale Biosphere-Atmosphere Experiment in Amazonia) eddy covariance tower sites and a TR-driven physically based modeling approach, we identified the canopy-scale feedback-response mechanism between gC, λET, and atmospheric vapor pressure deficit (DA), without using any leaf-scale empirical parameterizations for the modeling. The TR-based model shows minor biophysical control on λET during the wet (rainy) seasons, when λET becomes predominantly radiation driven and net radiation (RN) determines 75 to 80 % of the variance of λET. However, biophysical control on λET increases dramatically during the dry seasons, and particularly during the 2005 drought year, explaining 50 to 65 % of the variance of λET, which indicates that λET is substantially soil moisture driven during the rainfall deficit phase. Despite substantial differences in gA between forests and pastures, very similar canopy-atmosphere "coupling" was found in these two biomes due to a soil moisture-induced decrease in gC in the pasture. This revealed the pragmatic aspect of the TR-driven model behavior, which exhibits a high sensitivity of gC to per unit change in wetness, whereas gA is only marginally sensitive to surface wetness variability. Our results reveal a significant hysteresis between λET and gC during the dry season for the pasture sites, which is attributed to relatively low soil water availability as compared to the rainforests, likely due to differences in rooting depth between the two systems. Evaporation was significantly influenced by gA for all the PFTs and across all wetness conditions. Our analytical framework logically captures the responses of gC and gA to changes in atmospheric radiation, DA, and surface radiometric temperature, and thus appears promising for improving existing land-surface-atmosphere exchange parameterizations across a range of spatial scales.
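
For orientation, the Penman-Monteith combination equation that such TR-driven frameworks build on relates λE to the two conductances; the form below is the standard textbook expression, not necessarily the exact formulation used in the study:

\lambda E = \frac{\Delta (R_N - G) + \rho c_p D_A g_A}{\Delta + \gamma \left(1 + g_A / g_C\right)}

where Δ is the slope of the saturation vapor pressure curve, γ the psychrometric constant, ρ the air density, c_p the specific heat of air at constant pressure, and G the soil heat flux. The aerodynamic term grows with g_A and D_A, while stomatal closure (smaller g_C) suppresses λE through the denominator.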

Relevance: 80.00%

Publisher:

Abstract:

This research work concerns the application of additive manufacturing (AM) technologies in new electric mobility sectors. The unmatched freedom that AM offers can potentially change the way electric motors are designed and manufactured. The thesis investigates the possibility of creating optimized electric machines that exploit AM technologies, with potential in various industrial sectors, including automotive and aerospace. In particular, we evaluate how the design of electric motors can be improved by producing the rotor core using Laser Powder Bed Fusion (LPBF) and how the resulting design choices affect component performance. First, the metallurgical and soft magnetic properties of pure iron and silicon iron alloy (Fe-3 wt.% Si) parts produced by LPBF are defined and discussed, considering the process parameters and the type of heat treatment. This research shows that, for both pure iron and silicon iron, parts produced by LPBF have mechanical and magnetic properties that differ from those of their laminated counterparts. Hence, FEM-based modeling is employed to design the rotor core of a synchronous reluctance (SynRM) machine so as to minimize torque ripple while maintaining structural integrity. Finally, we suggest that further research should extend the field of applicability to other electrical devices.

Relevance: 50.00%

Publisher:

Abstract:

PURPOSE: Few studies compare the variabilities that characterize environmental monitoring (EM) and biological monitoring (BM) data. Comparing their respective variabilities can help to identify the best strategy for evaluating occupational exposure. The objective of this study is to quantify the biological variability associated with 18 bio-indicators currently used in work environments. METHOD: Intra-individual (BV(intra)), inter-individual (BV(inter)), and total biological variability (BV(total)) were quantified using validated physiologically based toxicokinetic (PBTK) models coupled with Monte Carlo simulations. Two environmental exposure profiles with different levels of variability were considered (GSDs of 1.5 and 2.0). RESULTS: PBTK models coupled with Monte Carlo simulations were successfully used to predict the biological variability of biological exposure indicators. The predicted values follow a lognormal distribution, characterized by GSDs ranging from 1.1 to 2.3. Our results show that there is a link between biological variability and the half-life of bio-indicators, since BV(intra) and BV(total) both decrease as bio-indicator half-lives increase. BV(intra) is always lower than the variability in the air concentrations. On an individual basis, this means that the variability associated with the measurement of biological indicators is always lower than the variability characterizing airborne levels of contaminants. For a group of workers, BM is less variable than EM for bio-indicators with half-lives longer than 10-15 h. CONCLUSION: The variability data obtained in the present study can be useful in the development of BM strategies for exposure assessment and can be used to calculate the number of samples required, thus guiding industrial hygienists or medical doctors in decision-making.
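
As a rough illustration of the method (not the validated PBTK models of the study), the sketch below couples a deliberately simplified one-compartment toxicokinetic model with Monte Carlo sampling of exposure and physiological parameters and reports the resulting geometric standard deviation (GSD) of a hypothetical bio-indicator; all parameter values are invented.

```python
# Minimal sketch: couple a simple one-compartment toxicokinetic model with
# Monte Carlo sampling to estimate the biological variability (GSD) of a
# bio-indicator. All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def biomarker_end_of_shift(air_conc, half_life_h, uptake_frac, vd_l, shift_h=8.0):
    """End-of-shift biomarker concentration for a one-compartment model."""
    k = np.log(2) / half_life_h              # first-order elimination rate (1/h)
    intake_rate = air_conc * uptake_frac     # absorbed dose rate (arbitrary units/h)
    # Steady uptake during the shift, first-order elimination:
    return intake_rate / (k * vd_l) * (1.0 - np.exp(-k * shift_h))

n = 10_000
air = rng.lognormal(mean=np.log(1.0), sigma=np.log(2.0), size=n)  # GSD = 2.0 exposure profile
half_life = rng.lognormal(np.log(10.0), 0.2, n)                   # inter-individual variation
uptake = rng.normal(0.5, 0.05, n).clip(0.1, 0.9)
vd = rng.lognormal(np.log(40.0), 0.15, n)

c = biomarker_end_of_shift(air, half_life, uptake, vd)
gsd = np.exp(np.std(np.log(c)))
print(f"Predicted biological GSD of the bio-indicator: {gsd:.2f}")
```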

Relevance: 50.00%

Publisher:

Abstract:

Most psychophysical studies of object recognition have focused on the recognition and representation of individual objects on which subjects had previously been explicitly trained. Correspondingly, modeling studies have often employed a 'grandmother'-type representation in which the objects to be recognized were represented by individual units. However, objects in the natural world are commonly members of a class containing a number of visually similar objects, such as faces, for which physiology studies have provided support for a representation based on a sparse population code, which permits generalization from the learned exemplars to novel objects of that class. In this paper, we present results from psychophysical and modeling studies intended to investigate object recognition in natural ('continuous') object classes. In two experiments, subjects were trained to perform subordinate-level discrimination in a continuous object class - images of computer-rendered cars - created using a 3D morphing system. By comparing the recognition performance of trained and untrained subjects, we could estimate the effects of viewpoint-specific training and infer properties of the object class-specific representation learned as a result of training. We then compared the experimental findings to simulations, building on our recently presented HMAX model of object recognition in cortex, to investigate the computational properties of a population-based object class representation as outlined above. We find experimental evidence, supported by modeling results, that training builds a viewpoint- and class-specific representation that supplements a pre-existing representation with lower shape discriminability but possibly greater viewpoint invariance.
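
A minimal sketch of the population-code idea referred to above (not the HMAX implementation itself): units broadly tuned to learned exemplars of a morphable class respond in a graded way to novel morphs, which is what supports generalization. The feature vectors and tuning width are hypothetical.

```python
# Illustrative population code: Gaussian (RBF) units tuned to learned exemplars
# of a morphable object class; a novel morph evokes graded, distributed activity.
import numpy as np

def population_response(stimulus, prototypes, sigma=2.0):
    """Response of each unit, broadly tuned to its learned prototype."""
    d = np.linalg.norm(prototypes - stimulus, axis=1)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

rng = np.random.default_rng(1)
prototypes = rng.normal(size=(10, 5))               # 10 learned car exemplars, 5-D shape features
novel = 0.5 * prototypes[0] + 0.5 * prototypes[1]   # a morph between two learned cars

resp_learned = population_response(prototypes[0], prototypes)
resp_novel = population_response(novel, prototypes)
# The novel morph still produces a distributed activity pattern that a
# downstream classifier can exploit for generalization.
print(resp_learned.round(2))
print(resp_novel.round(2))
```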

Relevance: 50.00%

Publisher:

Abstract:

Smooth flow of production in construction is hampered by the disparity between individual trade teams' goals and the goal of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) provides a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.

Relevance: 50.00%

Publisher:

Abstract:

The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile. Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
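
The sketch below illustrates, in a heavily reduced form, the prosumer/aggregator structure described above: an aggregator broadcasts a load-shifting signal and flexible prosumers respond. The agent behaviours, signal and parameters are illustrative assumptions, not the CASCADE Framework itself.

```python
# Toy agent-based model: prosumers with flexible demand respond to a signal from
# an aggregator that tries to flatten aggregate demand toward a target.
import random

class Prosumer:
    def __init__(self, base_demand_kw, flexible_kw):
        self.base = base_demand_kw
        self.flex = flexible_kw

    def demand(self, shift_request):
        # Shed a share of flexible load proportional to the aggregator's request.
        return self.base + self.flex * (1.0 - shift_request)

class Aggregator:
    def __init__(self, prosumers, target_kw):
        self.prosumers = prosumers
        self.target = target_kw

    def step(self):
        # Naive proportional signal: request more shifting when demand exceeds target.
        current = sum(p.demand(0.0) for p in self.prosumers)
        request = min(1.0, max(0.0, (current - self.target) / current))
        return sum(p.demand(request) for p in self.prosumers)

random.seed(0)
agents = [Prosumer(random.uniform(0.5, 2.0), random.uniform(0.2, 1.0)) for _ in range(100)]
agg = Aggregator(agents, target_kw=120.0)
print(f"Aggregate demand after one control step: {agg.step():.1f} kW")
```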

Relevance: 50.00%

Publisher:

Abstract:

Distributed energy and water balance models require time-series surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extrapolate the point-scale values registered at weather stations to the watershed scale. In mountainous areas, where the monitoring network ineffectively covers the complex terrain heterogeneity, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent software tool. The individual interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007) and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event-duration or annual scales. It offers its own pre- and post-processing tools, including video outlook, map printing and the possibility of exporting the maps to image or ASCII ArcGIS formats. This study presents the friendly user interface of the software and shows some case studies with applications to hydrological modeling.
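
As an illustration of the kind of height-corrected interpolation MeteoMap applies to variables such as temperature (the actual algorithms follow Herrero et al., 2007), the sketch below reduces station temperatures to a common reference level with an assumed constant lapse rate, interpolates horizontally by inverse-distance weighting, and re-applies the lapse rate at the target elevation; the stations and constants are made up.

```python
# Height-corrected temperature interpolation: lapse-rate reduction to sea level,
# inverse-distance weighting in the horizontal, lapse-rate re-application at the target.
import numpy as np

LAPSE_RATE = -0.0065  # degC per metre, assumed constant

def interpolate_temperature(stations, target_xy, target_z, power=2.0):
    """stations: list of (x, y, elevation, temperature) tuples."""
    xy = np.array([(s[0], s[1]) for s in stations])
    z = np.array([s[2] for s in stations])
    t = np.array([s[3] for s in stations])

    # 1. Reduce station temperatures to a common reference level (sea level).
    t_sea = t - LAPSE_RATE * z
    # 2. Inverse-distance weighting in the horizontal plane.
    d = np.linalg.norm(xy - np.asarray(target_xy), axis=1)
    if np.any(d == 0):
        t_sea_target = t_sea[np.argmin(d)]
    else:
        w = 1.0 / d ** power
        t_sea_target = np.sum(w * t_sea) / np.sum(w)
    # 3. Re-apply the lapse rate at the target elevation.
    return t_sea_target + LAPSE_RATE * target_z

stations = [(0, 0, 500, 14.0), (10_000, 0, 1500, 8.5), (0, 10_000, 2500, 1.0)]
print(f"{interpolate_temperature(stations, (4000, 4000), 1800):.1f} degC")
```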

Relevance: 50.00%

Publisher:

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the user's session; therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and it is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
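
A minimal sketch of the n-gram idea described above (not the Intruder Detector implementation): a per-user bigram model of web-log actions is trained, and sessions with unusually low smoothed likelihood under that model are flagged. The log format, smoothing and example sessions are illustrative assumptions.

```python
# Bigram model over web-log action sequences; low average log-likelihood under a
# user's own model suggests the session may not belong to that user.
from collections import Counter
import math

def train_bigram_model(sessions):
    counts, context = Counter(), Counter()
    for actions in sessions:
        for prev, cur in zip(actions, actions[1:]):
            counts[(prev, cur)] += 1
            context[prev] += 1
    return counts, context

def avg_log_likelihood(actions, model, vocab_size, alpha=1.0):
    counts, context = model
    ll, n = 0.0, 0
    for prev, cur in zip(actions, actions[1:]):
        # Additive (Laplace) smoothing keeps unseen transitions finite.
        p = (counts[(prev, cur)] + alpha) / (context[prev] + alpha * vocab_size)
        ll += math.log(p)
        n += 1
    return ll / max(n, 1)

history = [["login", "view_report", "export", "logout"],
           ["login", "view_report", "view_report", "logout"]]
model = train_bigram_model(history)
vocab = {a for s in history for a in s}

normal = avg_log_likelihood(["login", "view_report", "logout"], model, len(vocab))
odd = avg_log_likelihood(["login", "export", "export", "export"], model, len(vocab))
print(f"normal session: {normal:.2f}, suspicious session: {odd:.2f}")
```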

Relevance: 40.00%

Publisher:

Abstract:

Background: Malaria is an important threat to travelers visiting endemic regions. The risk of acquiring malaria is complex, and a number of factors, including transmission intensity, duration of exposure, season of the year and use of chemoprophylaxis, have to be taken into account when estimating risk. Materials and methods: A mathematical model was developed to estimate the risk of a non-immune individual acquiring falciparum malaria when traveling to the Amazon region of Brazil. The risk of malaria infection to travelers was calculated as a function of duration of exposure and season of arrival. Results: The results suggest significant variation in risk for non-immune travelers depending on arrival season, duration of the visit and transmission intensity. The calculated risk for visitors staying longer than 4 months during peak transmission was 0.5% per visit. Conclusions: Risk estimates from mathematical models built on accurate data can be a valuable tool in assessing risks/benefits and costs/benefits when deciding on the value of interventions for travelers to malaria-endemic regions.
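
One common way to set up such a model, shown below purely for illustration, is to treat infection as a Poisson process whose hazard depends on the season, so that the probability of at least one infection during a stay is 1 - exp(-cumulative hazard). The monthly hazards in the sketch are placeholders, not the fitted values of the study.

```python
# Seasonal hazard model for travel-acquired malaria risk (illustrative only).
import math

# Hypothetical monthly force of infection for a non-immune traveler (per month).
monthly_hazard = {"peak": 0.0015, "intermediate": 0.0008, "low": 0.0003}

def infection_risk(months_by_season):
    """months_by_season: e.g. {'peak': 3, 'low': 1} -> probability of >= 1 infection."""
    cumulative = sum(monthly_hazard[s] * m for s, m in months_by_season.items())
    return 1.0 - math.exp(-cumulative)

# A stay of four months arriving at the start of the peak season:
risk = infection_risk({"peak": 3, "intermediate": 1})
print(f"Estimated risk per visit: {100 * risk:.2f}%")
```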

Relevance: 40.00%

Publisher:

Abstract:

Southeastern Brazil has seen dramatic landscape modifications in recent decades due to the expansion of agriculture and urban areas; these changes have influenced the distribution and abundance of vertebrates. We developed predictive models of the ecological and spatial distribution of capybaras (Hydrochoerus hydrochaeris) using ecological niche modeling. Most occurrences of capybaras were in flat areas with water bodies surrounded by sugarcane and pasture. More than 75% of the Piracicaba River basin was estimated as potentially habitable by capybaras. The models had low omission error (2.3-3.4%) but higher commission error (91.0-98.5%); these "model failures" seem to be more related to local habitat characteristics than to spatial ones. The potential distribution of capybaras in the basin is associated with anthropogenic habitats, particularly with intensive land use for agriculture.

Relevance: 40.00%

Publisher:

Abstract:

Distributed control systems consist of sensors, actuators and controllers interconnected by communication networks, and are characterized by a high number of concurrent processes. This work presents a proposed procedure to model and analyze communication networks for distributed control systems in intelligent buildings. The approach is based on characterizing the control system as a discrete event system and applying coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of intelligent buildings, taking into account the relationships between the various building systems. The procedure provides a structured development of models, facilitating the process of specifying the control algorithm. An application example is presented in order to illustrate the main features of this approach.
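
The sketch below illustrates the coloured-Petri-net mechanics on which such a procedure rests: places hold typed ("coloured") tokens and a transition fires only when its input places and guard are satisfied. The places, transition and token colours are hypothetical; real models (e.g., built in CPN Tools) carry much richer inscriptions.

```python
# Toy coloured Petri net: a transition consumes a typed sensor-message token and a
# controller token, and produces an actuator-command token when its guard holds.
class ColouredPetriNet:
    def __init__(self):
        self.places = {"sensor_msgs": [], "controller_ready": ["ctrl-1"], "actuator_cmds": []}

    def fire_dispatch(self):
        """Guard: only 'presence' events are dispatched to the actuator network."""
        msgs = [m for m in self.places["sensor_msgs"] if m["type"] == "presence"]
        if msgs and self.places["controller_ready"]:
            msg = msgs[0]
            self.places["sensor_msgs"].remove(msg)
            ctrl = self.places["controller_ready"].pop()
            self.places["actuator_cmds"].append({"by": ctrl, "cmd": "lights_on", "zone": msg["zone"]})
            return True
        return False

net = ColouredPetriNet()
net.places["sensor_msgs"].append({"type": "presence", "zone": "lobby"})
net.fire_dispatch()
print(net.places["actuator_cmds"])
```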

Relevance: 40.00%

Publisher:

Abstract:

The purpose is to present the research that led to the modeling of an information system aimed at maintaining traceability data in the Brazilian wine industry, according to the principles of a service-oriented architecture (SOA). Since 2005, maintaining traceability data has been an obligation for all producers that intend to export to any European Union country. Final customers, including Brazilian ones, have also been asking for information about food products. A solution that contemplated the industry collectively was sought, so that consortiums or associations of producers could share the costs and benefits of such a solution. Following an extensive bibliographic review, a series of interviews conducted with Brazilian researchers and wine producers in Bento Goncalves - RS, Brazil, elucidated many aspects of the wine production process. Information technology issues related to the theme were also researched. The software was modeled with the Unified Modeling Language (UML) and uses web services for data exchange. A model for the wine production process was also proposed. A functional prototype showed that the adopted model is able to fulfill the demands of wine producers. The good results obtained lead us to consider the use of this model in other domains.

Relevance: 40.00%

Publisher:

Abstract:

A new, simple approach for modeling and assessing the operation and response of multiline voltage-source converter (VSC)-based flexible AC transmission system controllers, namely the generalized interline power-flow controller (GIPFC) and the interline power-flow controller (IPFC), is presented in this paper. The model and the analysis developed are based on the converters' power balance method, which makes use of the d-q orthogonal coordinates to arrive at a direct solution for these controllers through a quadratic equation. The main constraints and limitations that such devices present while controlling the two independent AC systems considered are also evaluated. In order to examine and validate the steady-state model initially proposed, a phase-shift VSC-based GIPFC was also built in the Alternative Transients Program (ATP), whose results are also included in this paper. Where applicable, a comparative evaluation between the GIPFC and the IPFC is also presented.
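
For readers unfamiliar with the d-q coordinates mentioned above, the sketch below implements the standard amplitude-invariant Park transformation, which maps balanced three-phase quantities onto rotating orthogonal d and q axes so that powers at each converter terminal can be handled as dc-like quantities; it is a generic illustration, not the paper's power-balance model.

```python
# Amplitude-invariant Park (abc -> dq) transformation.
import numpy as np

def abc_to_dq(va, vb, vc, theta):
    """theta: reference angle in radians; returns (d, q) components."""
    two_thirds = 2.0 / 3.0
    d = two_thirds * (va * np.cos(theta)
                      + vb * np.cos(theta - 2 * np.pi / 3)
                      + vc * np.cos(theta + 2 * np.pi / 3))
    q = -two_thirds * (va * np.sin(theta)
                       + vb * np.sin(theta - 2 * np.pi / 3)
                       + vc * np.sin(theta + 2 * np.pi / 3))
    return d, q

# A balanced set aligned with the reference angle maps to (V_peak, 0):
t, f, V = 0.0, 60.0, 311.0
theta = 2 * np.pi * f * t
va = V * np.cos(theta)
vb = V * np.cos(theta - 2 * np.pi / 3)
vc = V * np.cos(theta + 2 * np.pi / 3)
print(abc_to_dq(va, vb, vc, theta))   # approximately (311.0, 0.0)
```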

Relevance: 40.00%

Publisher:

Abstract:

The paper presents the development of a mechanical actuator using a shape memory alloy with a cooling system based on the thermoelectric effect (Seebeck-Peltier effect). Such a method has the advantage of reduced weight and requires a simpler control strategy as compared to other forced cooling systems. A complete mathematical model of the actuator was derived, and an experimental prototype was implemented. Several experiments are used to validate the model and to identify all parameters. A robust and nonlinear controller, based on sliding-mode theory, was derived and implemented. Experiments were used to evaluate the actuator closed-loop performance, stability, and robustness properties. The results showed that the proposed cooling system and controller are able to improve the dynamic response of the actuator.
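
A conceptual sketch of a sliding-mode control law of the kind referred to above, applied here to a deliberately crude first-order thermal model of an SMA wire with Peltier-assisted heating/cooling; the plant parameters, gains and the tanh boundary layer used to limit chattering are illustrative assumptions only.

```python
# Smoothed sliding-mode temperature control of a toy first-order thermal plant.
import math

# Hypothetical plant: C*dT/dt = -h*(T - T_amb) + u   (u > 0 heats, u < 0 cools via Peltier)
C, h, T_amb = 2.0, 0.15, 25.0

def sliding_mode_control(T, T_ref, k=8.0, phi=0.5):
    s = T_ref - T                      # sliding variable (tracking error)
    return k * math.tanh(s / phi)      # smoothed switching control (boundary layer)

T, dt, T_ref = 25.0, 0.05, 70.0        # initial wire temperature, time step, target (degC)
for _ in range(400):                   # simulate 20 s with explicit Euler integration
    u = sliding_mode_control(T, T_ref)
    T += dt / C * (-h * (T - T_amb) + u)

print(f"Temperature after 20 s: {T:.1f} degC (target {T_ref})")
```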

Relevance: 40.00%

Publisher:

Abstract:

Modern Integrated Circuit (IC) design is characterized by a strong trend toward the integration of Intellectual Property (IP) cores into complex system-on-chip (SoC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit the PD-based input space exactly. Both the input stimuli and coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
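
The sketch below illustrates the constrained-random idea in miniature: stimuli are drawn only from declared parameter domains (so invalid scenarios are never generated), and the cross-coverage model is derived from those same domains so it matches the restricted input space exactly. The parameter names, domains and exclusion are hypothetical, not the tools' actual formats.

```python
# Constrained-random stimulus generation with a coverage model derived from the
# same parameter domains, so coverage closure is measured over valid scenarios only.
import itertools
import random

# Parameter domains: invalid/irrelevant combinations are simply never generated.
PARAMETER_DOMAINS = {
    "burst_len": [1, 4, 8, 16],
    "addr_mode": ["aligned", "unaligned"],
    "response": ["okay", "slverr"],
}
EXCLUDED = {("unaligned", 16)}   # e.g. a (addr_mode, burst_len) pair the spec forbids

def gen_stimulus(rng):
    while True:
        stim = {k: rng.choice(v) for k, v in PARAMETER_DOMAINS.items()}
        if (stim["addr_mode"], stim["burst_len"]) not in EXCLUDED:
            return stim

def coverage_model():
    """Cross-coverage bins derived directly from the parameter domains."""
    bins = set(itertools.product(*PARAMETER_DOMAINS.values()))
    return {b for b in bins if (b[1], b[0]) not in EXCLUDED}

rng = random.Random(0)
goal, hit, tests = coverage_model(), set(), 0
while hit < goal:
    s = gen_stimulus(rng)
    hit.add((s["burst_len"], s["addr_mode"], s["response"]))
    tests += 1
print(f"{len(goal)} bins covered after {tests} random tests")
```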