857 results for complexity of agents


Relevance: 100.00%

Abstract:

Purpose: Increasing costs of health care, fuelled by the demand for high-quality, cost-effective healthcare, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CP) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and, as a result, can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics by capturing knowledge from the syntactic, semantic and pragmatic levels up to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing stakeholders, which we treat as social agents, together with their goals and patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CPs generated from SAM, together with norms, enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, of the ultimate power of decision making in exceptional circumstances.
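
As an illustration of the kind of behavioral knowledge the Norm Analysis Method is meant to capture, here is a minimal sketch in the common organizational-semiotics pattern "whenever <context> if <condition> then <agent> is <deontic operator> to <action>". The clinical condition, agent role and action are hypothetical examples, not taken from the paper, and the data structure is not the authors' actual schema.

```python
from dataclasses import dataclass
from enum import Enum

class Deontic(Enum):
    OBLIGED = "obliged"
    PERMITTED = "permitted"
    PROHIBITED = "prohibited"

@dataclass
class Norm:
    """A behavioral norm in the usual NAM pattern:
    whenever <context> if <condition> then <agent> is <deontic> to <action>."""
    context: str
    condition: str
    agent: str
    deontic: Deontic
    action: str

    def applies(self, context: str, condition_holds: bool) -> bool:
        # A norm is triggered when its context matches and its condition holds.
        return context == self.context and condition_holds

# Hypothetical norm for a stroke clinical pathway (illustrative only).
norm = Norm(
    context="acute stroke admission",
    condition="symptom onset < 4.5 hours",
    agent="attending physician",
    deontic=Deontic.OBLIGED,
    action="order a CT scan before thrombolysis",
)

if norm.applies("acute stroke admission", condition_holds=True):
    print(f"{norm.agent} is {norm.deontic.value} to {norm.action}")
```

In a full system, norms of this form would be attached to the concepts of the ontology chart and used to guide the BPMN-based generation of the pathway, as the abstract describes.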

Relevance: 100.00%

Abstract:

An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on the representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling applications using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) the parameter values identified should be adopted as default values in WRF.
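
The evaluation above compares modelled and observed surface energy fluxes, with RMSE as the headline metric. A minimal sketch of that comparison is given below; the flux values are placeholders, not data from the 15 sites or the BUBBLE campaign.

```python
import numpy as np

def rmse(observed: np.ndarray, modelled: np.ndarray) -> float:
    """Root-mean-square error between observed and modelled fluxes (W m-2)."""
    return float(np.sqrt(np.mean((modelled - observed) ** 2)))

# Placeholder hourly sensible heat fluxes (W m-2); illustrative only.
obs = np.array([120.0, 180.0, 210.0, 150.0, 90.0])
mod = np.array([110.0, 190.0, 230.0, 140.0, 100.0])
print(f"RMSE = {rmse(obs, mod):.1f} W m-2")
```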

Relevance: 100.00%

Abstract:

Black carbon aerosol plays a unique and important role in Earth's climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻². Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon.
In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility play important roles. The major sources of black carbon are presently in different stages with regard to the feasibility for near-term mitigation. This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.

Relevance: 100.00%

Abstract:

We study opinion dynamics in a population of interacting adaptive agents voting on a set of issues represented by vectors. We consider agents who can classify issues into one of two categories and can arrive at their opinions using an adaptive algorithm. Adaptation comes from learning, and the information for the learning process comes from interacting with neighboring agents and trying to change the internal state in order to concur with their opinions. The change in the internal state is driven by the information contained in the issue and in the opinion of the other agent. We present results in a simple yet rich context where each agent uses a Boolean perceptron to state its opinion. If the update occurs with information asynchronously exchanged among pairs of agents, then the typical case, if the number of issues is kept small, is the evolution into a society torn by the emergence of factions with extreme opposite beliefs. This occurs even when agents seek consensus with agents of opposite opinions. If the number of issues is large, the dynamics becomes trapped, the society does not evolve into factions and a distribution of moderate opinions is observed. The synchronous case is technically simpler and is studied by formulating the problem in terms of differential equations that describe the evolution of order parameters that measure the consensus between pairs of agents. We show that for a large number of issues and unidirectional information flow, global consensus is a fixed point; however, the approach to this consensus is glassy for large societies.
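
A minimal sketch of the kind of asynchronous update described above, assuming a standard perceptron-style learning rule: each agent holds a weight vector, states a binary opinion on an issue as the sign of a dot product, and, on disagreement, nudges its weights toward the neighbor's opinion. The learning rate, population size and issue dimension are illustrative, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, eta = 50, 5, 0.1          # illustrative sizes and learning rate

# Each agent's internal state is a weight vector; issues are random vectors.
weights = rng.normal(size=(n_agents, dim))

def opinion(w: np.ndarray, issue: np.ndarray) -> int:
    """Binary (+1/-1) opinion of a Boolean perceptron on an issue."""
    return 1 if w @ issue >= 0 else -1

for step in range(10_000):
    issue = rng.normal(size=dim)
    a, b = rng.choice(n_agents, size=2, replace=False)   # asynchronous pair exchange
    if opinion(weights[a], issue) != opinion(weights[b], issue):
        # Agent a adjusts its internal state toward agreeing with b on this issue
        # (perceptron rule with b's opinion as the target label).
        weights[a] += eta * opinion(weights[b], issue) * issue

# Pairwise agreement on a fresh set of issues after the dynamics has run.
test = rng.normal(size=(200, dim))
stated = np.sign(weights @ test.T)
agreement = (stated @ stated.T) / test.shape[0]
print("mean pairwise agreement:", round(float(agreement.mean()), 3))
```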

Relevance: 100.00%

Abstract:

We introduce a stochastic heterogeneous interacting-agent model for the short-time non-equilibrium evolution of excess demand and price in a stylized asset market. We consider a combination of social interaction within peer groups and individually heterogeneous fundamentalist trading decisions which take into account the market price and the perceived fundamental value of the asset. The resulting excess demand is coupled to the market price. Rigorous analysis reveals that this feedback may lead to price oscillations, a single bounce, or monotonic price behaviour. The model is a rare example of an analytically tractable interacting-agent model which allows us to deduce in detail the origin of these different collective patterns. For a natural choice of initial distribution, the results are independent of the graph structure that models the peer network of agents whose decisions influence each other.
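
A generic sketch of the feedback loop described above, not the paper's exact specification: heterogeneous fundamentalists form demand from the gap between their perceived fundamental value and the current price, a social-interaction term couples agents, and the aggregate excess demand feeds back into the price. Parameter names, the mean-field interaction and all numerical values are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, steps = 200, 300
beta, coupling, price_speed = 1.0, 0.5, 0.05      # illustrative parameters

fundamentals = rng.normal(loc=10.0, scale=1.0, size=n)   # heterogeneous perceived values
demand = np.zeros(n)
price = 12.0
prices = []

for t in range(steps):
    peer_pressure = demand.mean()     # mean-field stand-in for the peer network
    # Individual demand: fundamentalist term plus social interaction, bounded in [-1, 1].
    demand = np.tanh(beta * (fundamentals - price) + coupling * peer_pressure)
    # The price adjusts in the direction of the aggregate excess demand.
    price += price_speed * demand.mean()
    prices.append(price)

print(f"final price: {prices[-1]:.2f}, mean perceived fundamental: {fundamentals.mean():.2f}")
```

Depending on how the demand feedback is damped, a loop of this kind can overshoot and oscillate, bounce once, or relax monotonically toward the fundamentals, which is the qualitative range of behaviour the abstract refers to.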

Relevance: 100.00%

Abstract:

The rapid growth of urban areas has a significant impact on traffic and transportation systems. New management policies and planning strategies are clearly necessary to cope with the more than ever limited capacity of existing road networks. The concept of Intelligent Transportation System (ITS) arises in this scenario; rather than attempting to increase road capacity by means of physical modifications to the infrastructure, the premise of ITS relies on the use of advanced communication and computer technologies to handle today's traffic and transportation facilities. Influencing users' behaviour patterns is a challenge that has stimulated much research in the ITS field, where human factors gain great importance in modelling, simulating, and assessing such an innovative approach. This work is aimed at using Multi-agent Systems (MAS) to represent traffic and transportation systems in the light of the new performance measures brought about by ITS technologies. Agents are well suited to represent those components of a system that are geographically and functionally distributed, such as most components in traffic and transportation. A BDI (beliefs, desires, and intentions) architecture is presented as an alternative to the traditional models used to represent driver behaviour within microscopic simulation, allowing for an explicit representation of users' mental states. Basic concepts of ITS and MAS are presented, as well as some application examples related to the subject. This has motivated the extension of an existing microscopic simulation framework to incorporate MAS features and enhance the representation of drivers. In this way, demand is generated from a population of agents as the result of their daily decisions on route and departure time. The extended simulation model, which now supports the interaction of BDI driver agents, was effectively implemented, and different experiments were performed to test this approach in commuter scenarios. MAS provides a process-driven approach that fosters the easy construction of modular, robust, and scalable models, characteristics that are lacking in former result-driven approaches. Its abstraction premises allow for a closer association between the model and its practical implementation. Uncertainty and variability are addressed in a straightforward manner, since cognitive architectures, such as the BDI approach used in this work, provide an easier representation of human-like behaviours within the driver structure. In this way, MAS extends microscopic traffic simulation to better address the complexity inherent in ITS technologies.
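
A minimal, hypothetical sketch of a BDI-style driver agent of the kind described: beliefs hold expected travel times per route, the desire is to arrive by a preferred time, and the intention fixes a route and a departure time that are revised after each day's experience. Class and attribute names are illustrative and do not reflect the extended simulation framework itself.

```python
from dataclasses import dataclass, field

@dataclass
class BDIDriver:
    preferred_arrival: float                                   # desire: arrive by this time (min)
    beliefs: dict[str, float] = field(default_factory=dict)    # expected travel time per route
    route: str | None = None                                   # intention: chosen route
    departure: float = 0.0                                     # intention: chosen departure time
    learning_rate: float = 0.3

    def deliberate(self) -> None:
        """Commit to the route believed to be fastest and depart just early enough."""
        self.route = min(self.beliefs, key=self.beliefs.get)
        self.departure = self.preferred_arrival - self.beliefs[self.route]

    def perceive(self, route: str, experienced_time: float) -> None:
        """Update beliefs from the travel time actually experienced today."""
        old = self.beliefs[route]
        self.beliefs[route] = old + self.learning_rate * (experienced_time - old)

# Two simulated commuting days with illustrative numbers.
driver = BDIDriver(preferred_arrival=480.0,                    # 8:00 am, in minutes
                   beliefs={"arterial": 35.0, "highway": 30.0})
driver.deliberate()
print("day 1:", driver.route, "departing at", driver.departure)
driver.perceive(driver.route, experienced_time=60.0)           # congestion on the chosen route
driver.deliberate()
print("day 2:", driver.route, "departing at", driver.departure)
```

Aggregating the daily route and departure-time choices of a population of such agents is what produces the travel demand in the extended microscopic simulation described above.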

Relevance: 100.00%

Abstract:

This doctoral dissertation analyzes two novels by the American novelist Robert Coover as examples of hypertextual writing on the book-bound page, as tokens of hyperfiction. The complexity displayed in the novels, John's Wife and The Adventures of Lucky Pierre, integrates the cultural elements that characterize the contemporary condition of capitalism and the technologized practices that have fostered a different subjectivity evidenced in hypertextual writing and reading: the posthuman subjectivity. The models that account for the complexity of each novel are drawn from the concept of strange attractors in Chaos Theory and from the concept of the rhizome in Nomadology. The transformations the characters undergo in the degree of their corporeality set the plane on which to discuss turbulence and posthumanity. The notions of dynamic patterns and strange attractors, along with the concepts of the Body without Organs and the Rhizome, are interpreted, leading to a revision of narratology and to analytical categories appropriate to the study of the novels. The reading exercised throughout this dissertation enacts Daniel Punday's corporeal reading. The changes in the characters' degree of materiality are associated with the stages of order, turbulence and chaos in the story, bearing on the constitution of subjectivity within and along the reading process. Coover's inscription of planes of consistency to counter linearity and accommodate hypertextual features in paper-supported narratives describes the characters' trajectories as rhizomatic. The study led to the conclusion that narrative today stands more as a regime in a rhizomatic relation with other regimes in cultural practice than as an exclusively literary form and genre. Besides this, posthuman subjectivity emerges as a class identity, with hypertextual novels as its literary form of choice.

Relevance: 100.00%

Abstract:

Developing software is still a risky business. After 60 years of experience, this community is still not able to consistently build Information Systems (IS) for organizations with predictable quality, within previously agreed budget and time constraints. Although software is changeable, we are still unable to cope with the amount and complexity of change that organizations demand for their IS. To improve results, developers have followed two alternatives: frameworks, which increase productivity but constrain the flexibility of possible solutions; and agile ways of developing software, which keep flexibility with fewer upfront commitments. With strict frameworks, specific hacks have to be put in place to get around the framework's construction options. In time, this leads to inconsistent architectures that are harder to maintain due to incomplete documentation and human resources turnover. The main goal of this work is to create a new way to develop flexible IS for organizations, using web technologies, in a faster, better and cheaper way that is more suited to handling organizational change. To do so, we propose an adaptive object model that uses a new ontology for data and action with strict normalizing rules. These rules should bound the effects of changes, which can then be better tested and therefore corrected. Interfaces are built with templates of resources that can be reused and extended in a flexible way. The "state of the world" for each IS is determined by all production and coordination acts that agents performed over time, even those performed by external systems. When bugs are found during maintenance, their past cascading effects can be checked through simulation, by re-running the log of transaction acts over time and checking the results against previous records. This work implements a prototype with part of the proposed system in order to make a preliminary assessment of its feasibility and limitations.
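
A minimal sketch, under assumptions of my own, of the replay idea described above: agents' production and coordination acts are appended to a log, the "state of the world" is rebuilt by replaying that log, and a corrected handler can be re-run over the same acts so that the results can be checked against previous records. The act kinds, the handlers and the state model are hypothetical, not the ontology proposed in the work.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Act:
    """A recorded production or coordination act performed by some agent."""
    agent: str
    kind: str          # e.g. "order_created", "order_cancelled"
    payload: dict

def replay(log: list[Act], handler: Callable[[dict, Act], dict]) -> dict:
    """Rebuild the 'state of the world' by applying every act in order."""
    state: dict = {}
    for act in log:
        state = handler(state, act)
    return state

def handler_v1(state: dict, act: Act) -> dict:
    # Buggy version: ignores cancellations.
    if act.kind == "order_created":
        state = {**state, act.payload["order"]: "open"}
    return state

def handler_v2(state: dict, act: Act) -> dict:
    # Fixed version: cancellations close the order.
    state = handler_v1(state, act)
    if act.kind == "order_cancelled":
        state = {**state, act.payload["order"]: "cancelled"}
    return state

log = [
    Act("alice", "order_created", {"order": "A-1"}),
    Act("external-erp", "order_cancelled", {"order": "A-1"}),   # act from an external system
]

# Compare the recorded (buggy) outcome with the outcome after the fix.
print("before fix:", replay(log, handler_v1))
print("after fix: ", replay(log, handler_v2))
```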

Relevance: 100.00%

Abstract:

Dispute settlement mechanisms help to create a fairly predictable and accurate environment in which economic agents can pursue their activities in the international arena. The World Trade Organization (WTO) Dispute Settlement Body (DSB) has now been in operation for 10 years and it is fitting, at this point, to assess the progress achieved by Latin America and the Caribbean, the region that made the most use of this mechanism during the period, and whose countries have made significant gains against protectionism in key export sectors. These successes constitute important precedents which will influence upcoming multilateral negotiations and future trade disputes. This article reviews the work carried out by the DSB, the role of the leading stakeholders in the system (the United States and the European Union) and the progress made by countries of the region in a global context marked by the complexity of trade issues and the legal framework that regulates them. The findings presented in this article are based on the study "Una década de funcionamiento del Sistema de Solución de Diferencias de la OMC: avances y desafíos".

Relevance: 100.00%

Abstract:

Background: Infectious diarrhea can be caused by bacteria, viruses, or protozoan organisms, or a combination of these. The identification of co-infections in dogs is important to determine the prognosis and to plan strategies for their treatment and prophylaxis. Although many pathogens have been individually detected with real-time polymerase chain reaction (PCR), a comprehensive panel of agents that cause diarrhea in privately owned dogs has not yet been established. The objective of this study was to use a real-time PCR diarrhea panel to survey the frequencies of pathogens and co-infections in owned dogs attended at a veterinary hospital, with and without diarrhea, as well as the frequency in different countries. Feces samples were tested for canine distemper virus, canine coronavirus, canine parvovirus type 2 (CPV-2), Clostridium perfringens alpha toxin (CPA), Cryptosporidium spp., Giardia spp., and Salmonella spp. using molecular techniques. Results: In total, 104 diarrheic and 43 control dogs that were presented consecutively at a major private veterinary hospital were included in the study. Overall, 71/104 (68.3%) dogs with diarrhea were positive for at least one pathogen: a single infection in 39/71 dogs (54.9%) and co-infections in 32/71 dogs (45.1%), including 21/32 dogs (65.6%) with dual, 5/32 (15.6%) with triple, and 6/32 (18.8%) with quadruple infections. In the control group, 13/43 (30.2%) dogs were positive, all with single infections only. The most prevalent pathogens in the diarrheic dogs were CPA (40/104 dogs, 38.5%), CPV-2 (36/104 dogs, 34.6%), and Giardia spp. (14/104 dogs, 13.5%). CPV-2 was the most prevalent pathogen in the dual co-infections, associated with CPA, Cryptosporidium spp., or Giardia spp. No statistical difference (P = 0.8374) was observed in the duration of diarrhea or the number of deaths (P = 0.5722) in the presence or absence of single or co-infections. Conclusions: Diarrheic dogs showed a higher prevalence of pathogen infections than the controls. Whereas the healthy dogs had only single infections, about half the diarrheic dogs had co-infections. Therefore, multiple pathogens should be investigated in dogs presenting with diarrhea. The effects of multiple pathogens on the disease outcomes remain unclear, because the rate of death and the duration of diarrhea did not seem to be affected by these factors.

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

This is the first report on the parasitoid Palmistichus elaeisis (family Eulophidae) found in the field parasitizing pupae of eucalyptus defoliators. Lepidopteran pests occur in eucalyptus plantations in Brazil, reaching high population levels. Due to the complexity of pest control in eucalyptus forests, alternative control methods have been proposed, for instance biological control through the use of parasitoids. Natural enemies play an important role in regulating host populations because their larvae feed on the eggs, larvae, pupae or adults of other insects. The parasitic Hymenoptera are important agents in biological control programs against forest pests, and may provide economic and environmental benefits. The generalist endoparasitoid Palmistichus elaeisis Delvare and LaSalle, 1993 (Hymenoptera: Eulophidae) can develop in its hosts' pupae, overcoming the host's physiology, and can therefore be used for biological control of agricultural and forest pests. This study aimed to evaluate the impact of P. elaeisis as a pupal parasitoid of Sarsina violascens, providing a potential alternative to chemical control of this pest, and the use of an alternative host for rearing. The experiment was developed in the Laboratory for Biological Control of Forest Pests, Universidade Estadual Paulista "Julio Mesquita Filho". Parasitoids used in this test were originally collected on pupae of E. eucerus (Lepidoptera: Riodinidae) in eucalyptus plantations at Lençois Paulistas, São Paulo state, Brazil, in 2011. Thereafter, a laboratory culture has been maintained, using pupae of Spodoptera frugiperda (J. E. Smith) (Lep.: Noctuidae) as hosts. S. violascens eggs were collected in a eucalyptus clonal plantation in São Paulo state (Brazil). Larvae were reared under ambient conditions on Eucalyptus urophylla S.T. Blake (Myrtaceae) leaves. The following parameters were determined: parasitism level, numbers of emerged and non-emerged parasitoids, and duration of the egg-adult cycle. The S. violascens pupae were dissected to evaluate the non-emerged parasitoids. The parasitism level reached 100%, with a 100% emergence rate. It was verified that 113.2±0.8 parasitoids emerged per individual pupa versus only 0.7±0.1 that did not emerge. The P. elaeisis egg-adult cycle was 20.3±0.6 days in S. violascens pupae. This opens new perspectives for utilizing this parasitoid in biological control programs against caterpillars of forestry importance, such as S. violascens, in Brazil.

Relevance: 100.00%

Abstract:

This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. The correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas. The failures may be caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples are given that verify them.
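
As a rough illustration of what a graph model of a reconfigurable antenna can look like (a simplified reading, not necessarily the paper's formulation): antenna parts are vertices, switchable connections are edges, a crude complexity proxy is the number of switched edges, and reliability can be estimated as the probability that a desired configuration stays connected when each switch fails independently. The topology and failure probability below are invented for the example.

```python
import itertools
import networkx as nx

# Hypothetical reconfigurable antenna: parts as nodes, RF switches as edges.
G = nx.Graph()
G.add_edges_from([("feed", "patch1"), ("patch1", "patch2"),
                  ("feed", "patch3"), ("patch3", "patch2")])

p_fail = 0.05                        # assumed independent failure probability per switch
complexity = G.number_of_edges()     # simple proxy: number of switched connections

def config_reliability(graph: nx.Graph, src: str, dst: str, p_fail: float) -> float:
    """Probability that src and dst stay connected, enumerating all switch-failure states."""
    edges = list(graph.edges)
    prob = 0.0
    for states in itertools.product([True, False], repeat=len(edges)):   # True = switch works
        H = nx.Graph()
        H.add_nodes_from(graph.nodes)
        H.add_edges_from(e for e, ok in zip(edges, states) if ok)
        p_state = 1.0
        for ok in states:
            p_state *= (1 - p_fail) if ok else p_fail
        if nx.has_path(H, src, dst):
            prob += p_state
    return prob

print("complexity (switched edges):", complexity)
print("reliability feed-patch2:", round(config_reliability(G, "feed", "patch2", p_fail), 4))
```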

Relevance: 100.00%

Abstract:

Chemical agents used in cancer therapy are associated with cell cycle arrest and the activation or deactivation of mechanisms associated with DNA repair and apoptosis. However, due to the complexity of biological systems, the molecular mechanisms responsible for these activities are not fully understood. Thus, studies of gene and protein expression have shown promising results for understanding the mechanisms related to cellular responses and the regression of cancer after chemotherapy. This study aimed to evaluate gene and protein expression profiling in bladder transitional cell carcinoma (TCC) cell lines with different TP53 status after gemcitabine (1.56 μM) treatment. The RT4 (grade 1, TP53 wild type), 5637 (grade 2, TP53 mutated) and T24 (grade 3, TP53 mutated) cell lines were used. PCR arrays and mass spectrometry were used to analyze gene and protein expression, respectively. Morphological alterations were observed using scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The PCR array results showed that gemcitabine activity was mainly related to CDKN1A, GADD45A and SERTDA1 overexpression, and to BAX overexpression only in the TP53 wild-type cells. Mass spectrometry demonstrated that gemcitabine modulated protein expression, especially of proteins from genes related to apoptosis, vesicle transport and the stress response. Analyses using SEM and TEM showed changes in cell morphology independently of the cell line studied. The observed decrease in the number of microvilli suggests reduced contact among cells and between the cells and the extracellular matrix; irregular shapes might indicate actin cytoskeleton deregulation; and the reduction in the number of organelles and in nuclear size might indicate reduced cellular metabolism. In conclusion, independently of TP53 status or bladder tumor grade, gemcitabine modulated genes related to the cell cycle and apoptosis, which was reflected in morphological changes indicative of subsequent cell death.