902 results for physically based modeling
Abstract:
Polycondensation of 2,6-dihydroxynaphthalene with 4,4'-bis(4''-fluorobenzoyl)biphenyl affords a novel, semicrystalline poly(ether ketone) with a melting point of 406 °C and a glass transition temperature (onset) of 168 °C. Molecular modeling and diffraction-simulation studies of this polymer, coupled with data from the single-crystal structure of an oligomer model, have enabled the crystal and molecular structure of the polymer to be determined from X-ray powder data. This structure, the first for any naphthalene-containing poly(ether ketone), is fully ordered, in monoclinic space group P2₁/b, with two chains per unit cell. Rietveld refinement against the experimental powder data gave a final agreement factor (R_wp) of 6.7%.
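For reference, the weighted-profile agreement factor R_wp quoted above is the standard Rietveld figure of merit, computed from the observed and calculated powder-profile intensities y_i:

\[ R_{wp} = \left[ \frac{\sum_i w_i \left( y_{i,\mathrm{obs}} - y_{i,\mathrm{calc}} \right)^2}{\sum_i w_i \, y_{i,\mathrm{obs}}^{2}} \right]^{1/2}, \qquad w_i = 1/\sigma_i^{2}. \]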
Abstract:
A model for the structure of amorphous molybdenum trisulfide, a-MoS3, has been created using reverse Monte Carlo methods. This model, which consists of chains of MoS6 units, each sharing three sulfurs with each of its two neighbors and forming alternating long (nonbonded) and short (bonded) Mo-Mo separations, is a good fit to the neutron diffraction data and is chemically and physically realistic. The paper identifies the limitations of previous models based on Mo3 triangular clusters in accounting for the available experimental data.
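The abstract does not give the algorithmic details, but the core of a reverse Monte Carlo fit is a Metropolis-style accept/reject step driven by the misfit to the measured data. A minimal Python sketch, assuming a user-supplied chi-squared function against the measured structure factor; the function names, move size and acceptance scale are illustrative, not those of the paper:

import numpy as np

def rmc_step(positions, chi2_current, compute_chi2, max_move=0.1, rng=np.random):
    """One reverse Monte Carlo move: displace a random atom and accept or reject
    the move based on the change in chi^2 against the experimental data."""
    i = rng.randint(len(positions))
    trial = positions.copy()
    trial[i] += rng.uniform(-max_move, max_move, size=3)
    chi2_trial = compute_chi2(trial)          # misfit of simulated vs. measured S(Q)
    if chi2_trial <= chi2_current or rng.random() < np.exp(-(chi2_trial - chi2_current) / 2.0):
        return trial, chi2_trial              # accept the move
    return positions, chi2_current            # reject and keep the old configuration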
Abstract:
Given that next- and current-generation networks will coexist for a considerable period of time, it is important to improve the performance of existing networks. One recently proposed improvement is to enhance the throughput of ad hoc networks by using dual-hop relay-based transmission schemes. Since throughput in ad hoc networks is normally related to their energy consumption, it is important to examine the impact of relay-based transmissions on energy consumption. In this paper, we present an analytical energy consumption model for dual-hop relay-based medium access control (MAC) protocols. Based on the recently reported relay-enabled Distributed Coordination Function (rDCF), we show the efficacy of the proposed analytical model. This is a generalized model that can be used to predict energy consumption in saturated relay-based ad hoc networks, both in an ideal environment and in the presence of transmission errors. It is shown that using a relay results not only in better throughput but also in better energy efficiency. Copyright (C) 2009 Rizwan Ahmad et al.
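As an illustration of the kind of comparison such an analytical model formalizes, the sketch below contrasts the per-packet energy of a direct transmission with that of two relayed hops at a higher data rate. The powers, rates and packet size are hypothetical, not the parameters of rDCF or of the paper's model:

def energy_per_packet(payload_bits, rate_bps, tx_power_w, rx_power_w, overhead_s=0.0):
    """Energy spent by transmitter and receiver for one packet (idealized, error-free)."""
    t_air = payload_bits / rate_bps + overhead_s
    return (tx_power_w + rx_power_w) * t_air

# Direct transmission at a low rate vs. two relayed hops at a higher rate.
direct = energy_per_packet(8000, rate_bps=1e6, tx_power_w=1.65, rx_power_w=1.4)
relayed = 2 * energy_per_packet(8000, rate_bps=11e6, tx_power_w=1.65, rx_power_w=1.4)
print(direct, relayed)   # relaying can win when the rate gain outweighs the extra hop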
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners that operate on discriminative image-intensity features, defined on small patches and fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. The method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator that requires only a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking, and compare the method's performance with other existing methods for line-search boundary detection.
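A hedged sketch of the two ingredients described above: an ensemble of weak learners voting on patch features, and a one-dimensional search along the normal to an initial boundary estimate. The feature extractor and learners are placeholders, not the paper's detectors:

import numpy as np

def boundary_score(patch_features, weak_learners, weights):
    """Weighted vote of trained weak learners on the features of one small patch."""
    votes = np.array([h(patch_features) for h in weak_learners])   # each learner returns +/-1
    return float(np.dot(weights, votes))

def line_search(extract_features, point, normal, weak_learners, weights, half_range=10):
    """Evaluate the classifier at offsets along the normal to an initial boundary
    estimate and return the offset with the strongest texture-transition response."""
    offsets = range(-half_range, half_range + 1)
    scores = [boundary_score(extract_features(point + d * normal), weak_learners, weights)
              for d in offsets]
    return list(offsets)[int(np.argmax(scores))]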
Abstract:
This paper describes the design and implementation of an agent-based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. The work draws on several research disciplines, including operational analysis, human-computer interaction, finite state modelling techniques, intelligent agents and computer-supported cooperative work. Aspects of these procedures have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite state automata. These automata have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.
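A minimal sketch of the kind of local finite state automaton that could be embedded in such an agent; the states, events and transitions shown are hypothetical, not those derived from the National Grid task analysis:

class LocalAutomaton:
    """Minimal finite state automaton encoding one user's local model of a
    collaborative switching task (states and events are illustrative)."""
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions            # {(state, event): next_state}

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

automaton = LocalAutomaton("idle", {("idle", "request_switch"): "awaiting_confirmation",
                                    ("awaiting_confirmation", "confirm"): "switching",
                                    ("switching", "complete"): "idle"})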
Abstract:
To improve the quality of healthcare services, integrated large-scale medical information systems are needed that adapt to the changing medical environment. In this paper, we propose a requirement-driven, hierarchically layered architecture for healthcare information systems. The system operates through a mapping mechanism between these layers and can thus organize its functions dynamically, adapting to users' requirements. Furthermore, we introduce organizational semiotics methods to capture and analyze users' requirements through ontology charts and norms. Based on these results, the structure of the user's requirement pattern (URP) is established as the driving factor of our system. Our research contributes to the design of healthcare system architectures that can adapt to the changing medical environment.
Abstract:
The retrieval (estimation) of sea surface temperatures (SSTs) from space-based infrared observations is increasingly performed using retrieval coefficients derived from radiative transfer simulations of top-of-atmosphere brightness temperatures (BTs). Typically, an estimate of SST is formed from a weighted combination of BTs at a few wavelengths, plus an offset. This paper addresses two questions about the radiative transfer modeling approach to deriving these weighting and offset coefficients. How precisely specified do the coefficients need to be in order to obtain the required SST accuracy (e.g., scatter <0.3 K in week-average SST, bias <0.1 K)? And how precisely is it actually possible to specify them using current forward models? The conclusions are that weighting coefficients can be obtained with adequate precision, while the offset coefficient will often require an empirical adjustment of the order of a few tenths of a kelvin against validation data. Thus, a rational approach to defining retrieval coefficients is one of radiative transfer modeling followed by offset adjustment. The need for this approach is illustrated from experience in defining SST retrieval schemes for operational meteorological satellites. A strategy is described for obtaining the required offset adjustment, and the paper highlights some of the subtler aspects involved with reference to the example of SST retrievals from the imager on the geostationary satellite GOES-8.
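Schematically, the retrieval described above is linear in the brightness temperatures, and the empirical adjustment amounts to correcting only the offset term (the symbols here are generic, not those of any particular operational scheme):

\[ \widehat{\mathrm{SST}} = a_0 + \sum_{k} a_k \, T_B(\lambda_k), \qquad a_0 \rightarrow a_0 + \Delta a_0, \]

where the weights a_k come from radiative transfer simulations and the offset adjustment \Delta a_0, typically a few tenths of a kelvin, is fitted against validation data.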
Abstract:
This paper introduces and evaluates DryMOD, a dynamic water balance model of the key hydrological processes in drylands that is based on free, public-domain datasets. The rainfall model of DryMOD makes optimal use of spatially disaggregated Tropical Rainfall Measuring Mission (TRMM) datasets to simulate hourly rainfall intensities at a spatial resolution of 1 km. Regional-scale applications of the model in seasonal catchments in Tunisia and Senegal characterize runoff and soil moisture distribution and dynamics in response to varying rainfall inputs and soil properties. The results highlight the need for hourly rainfall simulation and for correcting TRMM 3B42 rainfall intensities for the fractional cover of rainfall (FCR). Without FCR correction and disaggregation to 1 km, TRMM 3B42-based rainfall intensities are too low to generate surface runoff or to induce substantial changes in soil moisture storage. The sensitivity analysis shows that topsoil porosity is the most important soil property for the simulation of runoff and soil moisture. We thus demonstrate the benefit of hydrological investigations at a scale for which reliable information on soil profile characteristics exists and which is sufficiently fine to account for their heterogeneity. Where such information is available, DryMOD can assist in the spatial and temporal planning of water harvesting according to runoff-generating areas and the runoff ratio, as well as in the optimization of agricultural activities based on a realistic representation of soil moisture conditions.
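A minimal sketch of the fractional-cover-of-rainfall correction described above: rescaling the cell-average TRMM 3B42 rate to the intensity over the raining fraction of the cell. The function and numbers are illustrative, not DryMOD's implementation:

def correct_for_fractional_cover(trmm_rate_mm_h, fractional_cover):
    """Rescale a grid-cell-average TRMM 3B42 rainfall rate to the intensity over the
    raining fraction of the cell; without this the rate is diluted over the whole cell."""
    if fractional_cover <= 0.0:
        return 0.0
    return trmm_rate_mm_h / fractional_cover

# A cell-average 2 mm/h falling on only 20% of the cell corresponds to 10 mm/h locally,
# which is far more likely to exceed infiltration capacity and generate runoff.
print(correct_for_fractional_cover(2.0, 0.2))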
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand-flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile. Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
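A toy sketch of the prosumer/aggregator idea mentioned above: household agents with a shiftable fraction of demand responding to an aggregator signal. The classes, signal and numbers are illustrative, not the CASCADE Framework's actual agents:

import random

class Prosumer:
    """Toy household agent with a shiftable fraction of its electricity demand."""
    def __init__(self, base_kw, shiftable_kw):
        self.base_kw = base_kw
        self.shiftable_kw = shiftable_kw

    def demand(self, signal):
        # signal in [0, 1]: 1 asks the household to defer all shiftable load
        return self.base_kw + (1.0 - signal) * self.shiftable_kw

def aggregate_demand(prosumers, signal):
    """Total demand the aggregator sees for a given flattening signal."""
    return sum(p.demand(signal) for p in prosumers)

households = [Prosumer(random.uniform(0.3, 0.8), random.uniform(0.0, 0.5)) for _ in range(100)]
print(aggregate_demand(households, 0.0), aggregate_demand(households, 1.0))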
Abstract:
Inhibition of microtubule function is an attractive rational approach to anticancer therapy. Although taxanes are the most prominent among the microtubule stabilizers, their clinical toxicity, poor pharmacokinetic properties, and resistance have stimulated the search for new antitumor agents with the same mechanism of action. Discodermolide is an example of a nontaxane natural product with this mechanism of action that demonstrates superior antitumor efficacy and therapeutic index. These extraordinary chemical and biological properties have qualified discodermolide as a lead structure for the design of novel anticancer agents with optimized therapeutic properties. In the present work, we have employed a specialized fragment-based method to develop robust quantitative structure-activity relationship (QSAR) models for a series of synthetic discodermolide analogs. The generated molecular recognition patterns were combined with three-dimensional molecular modeling studies as a fundamental step toward understanding the molecular basis of drug-receptor interactions within this important series of potent antitumoral agents.
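For orientation only, the sketch below shows the generic fragment-based QSAR idea of regressing activity on fragment occurrence counts; the data are invented and this is not the specialized fragment-based method used in the paper:

import numpy as np

# Rows: analogs; columns: occurrence counts of substructural fragments (illustrative data).
fragment_counts = np.array([[1, 0, 2], [2, 1, 0], [0, 1, 1], [1, 1, 1]], dtype=float)
activities = np.array([6.1, 7.3, 5.4, 6.8])          # e.g. pIC50-like values (made up)

# Least-squares fragment contributions; an intercept column models the baseline activity.
X = np.hstack([np.ones((len(activities), 1)), fragment_counts])
coef, *_ = np.linalg.lstsq(X, activities, rcond=None)
predicted = X @ coef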
Abstract:
A major problem in e-service development is the prioritization of the requirements of different stakeholders. The main stakeholders are governments and their citizens, all of whom have different and sometimes conflicting requirements. In this paper, the prioritization problem is addressed by combining a value-based approach with an illustration technique. The paper examines the following research question: how can multiple stakeholder requirements be illustrated from a value-based perspective so that they can be prioritized? We used an e-service development case from a Swedish municipality to elaborate our approach. Our contributions are: 1) a model of the domains relevant to requirement prioritization, namely government, citizens, technology, finances, and laws and regulations; and 2) a requirement fulfillment analysis (RFA) tool that consists of a requirement-goal-value matrix (RGV) and a calculation and illustration module (CIM). The model reduces cognitive load, helps developers focus on value fulfillment in e-service development and supports them in the formulation of requirements. It also offers input to public policy makers, should they aim to target values in the design of e-services.
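A minimal sketch of the kind of calculation a requirement-goal-value matrix enables: weighting each requirement's contribution to stakeholder values and ranking requirements for prioritization. The requirements, values and weights are hypothetical, not those of the Swedish municipality case:

import numpy as np

# Rows: requirements; columns: stakeholder values (illustrative contribution weights in [0, 1]).
values = ["citizen convenience", "legal compliance", "cost efficiency"]
rgv = np.array([[0.9, 0.2, 0.1],    # R1: online application form
                [0.3, 0.9, 0.2],    # R2: identity verification
                [0.1, 0.3, 0.8]])   # R3: automated case routing

value_priorities = np.array([0.5, 0.3, 0.2])     # how strongly each value is weighted
requirement_scores = rgv @ value_priorities      # higher score suggests higher priority
ranking = np.argsort(requirement_scores)[::-1]
print(requirement_scores, ranking)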
Abstract:
This dissertation studies the propagation of crises through the financial system. More specifically, it seeks to develop models that make it possible to simulate how a given economic shock hits particular agents of the financial system and propagates from them, turning into a systemic problem. The dissertation is divided into two chapters in addition to the introduction. The first chapter develops a model of crisis propagation in investment funds based on network science. Combining two propagation models on financial networks, one simulating the propagation of losses in bipartite networks of assets and financial agents and the other simulating the propagation of losses in a network of direct investments in quotas of other agents, an algorithm is developed to simulate the propagation of losses through both mechanisms, and this algorithm is used to simulate a crisis in the Brazilian investment fund market. Chapter 2 develops an agent-based simulation model, with financial agents, to simulate the propagation of a shock affecting the repurchase agreement (repo) market. We also create an artificial market composed of banks, hedge funds and short-term funds, and simulate the propagation of a liquidity shock on a securitized risky asset used as collateral for the banks' repo operations.
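A minimal sketch of the loss-propagation step described in the first chapter, assuming agents hold assets (the bipartite layer) as well as quotas in one another; the matrices and the simple fixed-point propagation rule are illustrative, not the dissertation's algorithm:

import numpy as np

def propagate_losses(asset_shock, holdings, cross_holdings, rounds=10):
    """Iteratively propagate an initial asset-price shock through agents' asset
    holdings (bipartite layer) and their direct quota holdings in one another."""
    losses = holdings @ asset_shock                        # direct hit from devalued assets
    for _ in range(rounds):
        updated = holdings @ asset_shock + cross_holdings @ losses
        if np.allclose(updated, losses):
            break
        losses = updated
    return losses

# 3 agents, 2 assets: exposure and mutual quota-holding matrices are illustrative.
holdings = np.array([[5.0, 0.0], [2.0, 3.0], [0.0, 4.0]])
cross = np.array([[0.0, 0.1, 0.0], [0.2, 0.0, 0.1], [0.0, 0.3, 0.0]])
print(propagate_losses(np.array([0.1, 0.05]), holdings, cross))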
Abstract:
In a world where organizations are ever more complex, knowledge of the organizational self is a growing necessity. The DEMO methodology aims to specify the organizational self, capturing the essence of the organization in a way that is independent of its implementation while remaining coherent, consistent, complete, modular and objective. But such a notion of the organizational self is of little value if it is not shared by the organization's actors. To achieve this in a society that has grown attached to technology, and where time is of the utmost importance, a tool such as a semantic Wikipedia may be the ideal way of making the information accessible. However, establishing the DEMO methodology on such a platform requires bridges between its modeling components and the semantic Wikipedia. Our thesis focuses on that aspect: establishing and implementing, through a case study, the principles of a way of transforming DEMO diagrams into comprehensible pages on a semantic Wikipedia, while keeping them as abstract as possible to allow extension and generalization to all diagrams without losing any valuable information, so that, if desired, those diagrams may be recreated from the semantic pages, making the process a full cycle.
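A toy sketch of the kind of mapping the thesis describes, rendering one DEMO transaction as Semantic MediaWiki markup with [[Property::Value]] annotations; the property names and the transaction itself are hypothetical:

def transaction_to_wiki_page(transaction_id, name, initiator, executor):
    """Render one DEMO transaction as Semantic MediaWiki markup
    (property names here are hypothetical, not a fixed DEMO vocabulary)."""
    return "\n".join([
        "= {} {} =".format(transaction_id, name),
        "* Initiator: [[hasInitiator::{}]]".format(initiator),
        "* Executor: [[hasExecutor::{}]]".format(executor),
        "[[Category:DEMO Transaction]]",
    ])

print(transaction_to_wiki_page("T01", "course registration", "Student", "Registrar"))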
Abstract:
The purpose of this study was to identify whether the activity modeling framework supports problem analysis and provides a traceable and tangible connection from problem identification through to solution modeling. Validation of the methodology relied on a real problem from a Portuguese teaching syndicate (ASPE) concerning course development and management. The study was carried out with a view to producing a complete tutorial on applying the activity modeling framework to a real-world problem. Within each step of activity modeling, we provide a summary of the relevant elements required to perform it, point out some improvements and apply it to ASPE's real problem. It was found that activity modeling enables well-structured problem analysis and provides a guiding thread between problem and solution modeling. It was concluded that activity-based task modeling is key to shortening the gap between problem and solution. The results revealed that the solution obtained using the activity modeling framework solved the core concerns of our customer and allowed them to enhance the quality of their course development and management. The principal conclusion is that activity modeling is a properly defined methodology that supports software engineers in problem analysis, keeping a traceable guide between problem and solution.
Abstract:
The paper presents a methodology for modeling three-dimensional reinforced concrete members by means of embedded discontinuity elements based on the Continuum Strong Discontinuity Approach (CSDA). Mixture theory concepts are used to model reinforced concrete as a 3D composite material consisting of concrete with bundles of long fibers (rebars), oriented in different directions, embedded in it. The effects of the rebars are modeled by phenomenological constitutive models devised to reproduce the axial nonlinear behavior as well as the bond-slip and dowel action. The paper presents the constitutive models assumed for the components and the compatibility conditions chosen to constitute the composite. Numerical analyses of existing experimental reinforced concrete members are presented, illustrating the applicability of the proposed methodology.
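In the classical mixture-theory setting invoked above, every material point is assumed to be occupied simultaneously by all components, and the composite stress is a volume-fraction-weighted sum of the component stresses (written generically, not in the paper's exact notation):

\[ \boldsymbol{\sigma} = k_c \, \boldsymbol{\sigma}_c + \sum_{f} k_f \, \boldsymbol{\sigma}_f, \qquad k_c + \sum_f k_f = 1, \]

where k_c and k_f are the volume fractions of the concrete matrix and of each rebar bundle; a common compatibility choice, consistent with the one-dimensional rebar models mentioned above, is that each bundle responds to the composite strain projected onto its own direction.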