857 results for complexity of agents
Abstract:
An operational complexity model (OCM) is proposed to enable the complexity of both the cognitive and the computational components of a process to be determined. From the complexity of formation of a set of traces via a specified route a measure of the probability of that route can be determined. By determining the complexities of alternative routes leading to the formation of the same set of traces, the odds ratio indicating the relative plausibility of the alternative routes can be found. An illustrative application to a BitTorrent piracy case is presented, and the results obtained suggest that the OCM is capable of providing a realistic estimate of the odds ratio for two competing hypotheses. It is also demonstrated that the OCM can be straightforwardly refined to encompass a variety of circumstances.
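To make the odds-ratio idea concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that a route's probability falls off exponentially with its operational complexity (P ∝ 2^-C); the abstract does not specify the actual OCM mapping, and the complexity values and hypothesis labels below are hypothetical, not taken from the case study.

```python
# Minimal sketch of an odds-ratio comparison between two candidate routes,
# assuming (hypothetically) that the probability of a route decreases
# exponentially with its operational complexity, P(route) ∝ 2**(-C).
# The complexity values below are illustrative, not taken from the case study.

def route_probability(complexity_bits: float) -> float:
    """Unnormalised probability of a route with the given complexity (in bits)."""
    return 2.0 ** (-complexity_bits)

def odds_ratio(complexity_h1: float, complexity_h2: float) -> float:
    """Relative plausibility of hypothesis H1 versus H2 for the same set of traces."""
    return route_probability(complexity_h1) / route_probability(complexity_h2)

if __name__ == "__main__":
    # H1: deliberate download via BitTorrent; H2: traces planted by a third party
    # (illustrative labels only).
    c_h1, c_h2 = 32.0, 41.0   # hypothetical operational complexities in bits
    print(f"Odds ratio H1:H2 = {odds_ratio(c_h1, c_h2):.1f}")  # 2**(41-32) = 512
```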
Abstract:
I begin by citing a definition of "third wave" from the glossary in Turbo Chicks: Talking Young Feminisms at length because it communicates several key issues that I develop in this project. The definition introduces a tension within "third wave" feminism of building on and differentiating itself from second wave feminism, the newness of the term "third wave," its association with "young" women, the complexity of contemporary feminisms, and attention to multiple identities and oppressions. Uncovering explanations of "third wave" feminism that, like this one, go beyond generational associations is not an easy task. Authors consistently group new feminist voices together by age under the label "third wave" feminists without questioning the accuracy of the designation. Most explorations of "third wave" feminism overlook the complexities and distinctions that abound among "young" feminists: not all young feminists espouse similar ideas, tactics, and actions, and for various reasons, not all young feminists identify with a "third wave" of feminism. Less than a year after I began to learn about feminism I discovered Barbara Findlen's Listen Up: Voices From the Next Feminist Generation. Although neither the collection nor its contributors declare an association with "third wave" feminism, subsequent reviews and citations in articles identify it, along with Rebecca Walker's To Be Real: Telling the Truth and Changing the Face of Feminism, as a major text of "third wave" feminism. Re-reading Listen Up since beginning to research "third wave" feminism, I now understand its fundamental influence on my research questions as a starting point for assessing persistent exclusion in contemporary feminism, rather than as a revolutionary text (as it is claimed to be in many reviews). Findlen begins the introduction with the bold claim, "My feminism wasn't shaped by antiwar or civil rights activism ..." (xi). Framing the collection with a disavowal of the influence of women of color's organizational efforts negates, for me, the project's proclaimed commitment to multivocality. Though several contributions examine persistent exclusion within the contemporary feminist movement, the larger project seems to rely on these essays to reflect this commitment, suggesting that Listen Up does not go beyond the "add and stir" approach to "diversity." Interestingly, this statement does not appear in the new edition of Listen Up published in 2001. The content has also changed with this new edition, including several more Latina contributors and other "corrective" additions.
Abstract:
In the past few years, libraries have started to design public programs that educate patrons about different tools and techniques to protect personal privacy. But do end-user solutions provide adequate safeguards against surveillance by corporate and government actors? What does a comprehensive privacy plan entail if libraries are to live up to their privacy values? In this paper, the authors discuss the complexity of the surveillance architecture that the library institution may confront when seeking to defend the privacy rights of patrons. This architecture consists of three main parts: physical or material aspects, logical characteristics, and social factors of information and communication flows in the library setting. For each category, the authors present short case studies that are culled from practitioner experience, research, and public discourse. The case studies probe the challenges faced by the library, not only when making hardware and software choices, but also when making choices related to staffing and program design. The paper shows that privacy choices intersect not only with free speech and chilling effects, but also with questions that concern intellectual property, organizational development, civic engagement, technological innovation, public infrastructure, and more. The paper ends with a discussion of what libraries will require in order to sustain and improve efforts to serve as stewards of privacy in the 21st century.
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions, based on water levels, were introduced to evaluate the efficiency of each generation, with the aim of flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm manages to reduce the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
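As a rough illustration of the MPC-GA coupling described above, the following Python sketch replaces the conceptual Demer river model with a toy single-reservoir water balance over a fixed prediction horizon; the dynamics, parameter values, inflow forecast, and cost function are illustrative assumptions, not the study's model.

```python
import random

# Minimal sketch of the MPC-GA idea: a GA generates candidate gate-opening
# sequences over the prediction horizon, a simple water-balance model predicts
# the resulting water levels, and a flood-damage-style cost ranks candidates.
# All names, dynamics and parameter values here are illustrative assumptions.

HORIZON = 12          # number of control steps in the prediction horizon
DT = 3600.0           # time step [s]
AREA = 5.0e5          # reservoir surface area [m^2]
FLOOD_LEVEL = 2.0     # water level above which flood damage accumulates [m]
MAX_GATE_FLOW = 60.0  # outflow at a fully opened gate [m^3/s]

def forecast_inflow(step: int) -> float:
    """Hypothetical precipitation-driven inflow forecast [m^3/s]."""
    return 40.0 + 30.0 * (1 if 3 <= step <= 7 else 0)

def simulate(gate_openings, level0=1.5):
    """Predict water levels over the horizon for a sequence of gate openings (0..1)."""
    level, levels = level0, []
    for step, opening in enumerate(gate_openings):
        outflow = opening * MAX_GATE_FLOW
        level += (forecast_inflow(step) - outflow) * DT / AREA
        level = max(level, 0.0)
        levels.append(level)
    return levels

def cost(gate_openings) -> float:
    """Flood-damage-style cost: penalize levels above the flood threshold."""
    return sum(max(lvl - FLOOD_LEVEL, 0.0) ** 2 for lvl in simulate(gate_openings))

def genetic_search(pop_size=40, generations=60, mutation=0.2):
    """Semi-random GA over gate-opening sequences, as in the MPC-GA coupling."""
    pop = [[random.random() for _ in range(HORIZON)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]          # keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)    # one-point crossover
            cut = random.randrange(1, HORIZON)
            child = a[:cut] + b[cut:]
            if random.random() < mutation:      # occasional random mutation
                child[random.randrange(HORIZON)] = random.random()
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

if __name__ == "__main__":
    best = genetic_search()
    print("best cost:", round(cost(best), 4))
    print("predicted levels:", [round(l, 2) for l in simulate(best)])
```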
Abstract:
In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solutions, computational efficiency and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is a fully controlled environment where domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. The assimilation of the model-predicted velocity and laboratory observations significantly improves model predictions for both turbulence models by adjusting modelled flows in the harbour to match the de-errored observations. 3DVAR also allows shortcomings of the numerical model to be identified and quantified. Such a comprehensive analysis gives an optimal solution on the basis of which numerical model parameters can be estimated. The process of turbulence model optimization by reparameterization and tuning towards an optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing flows or recirculating flows.
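For readers unfamiliar with 3DVAR, the sketch below shows the standard linear 3DVAR (BLUE) analysis step that blends a background model state with observations by minimizing the usual quadratic cost function; the state vector, observation operator, and covariance values are illustrative stand-ins rather than the DIVAST or laboratory configuration.

```python
import numpy as np

# Minimal sketch of the linear 3DVAR update used to blend model-predicted
# velocities with observations. The state, observation operator and covariance
# values are illustrative stand-ins, not the DIVAST configuration.

def var3d_analysis(xb, y, H, B, R):
    """Return the analysis state minimizing
    J(x) = 0.5*(x-xb)^T B^-1 (x-xb) + 0.5*(Hx-y)^T R^-1 (Hx-y),
    i.e. xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)."""
    innovation = y - H @ xb
    gain = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
    return xb + gain @ innovation

if __name__ == "__main__":
    # Background (model) velocities at 4 grid points [m/s], underestimated as in the text.
    xb = np.array([0.10, 0.12, 0.08, 0.11])
    # Two laboratory observations at grid points 1 and 3 (hypothetical placement).
    H = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    y = np.array([0.18, 0.17])
    B = 0.02 * np.eye(4)      # background-error covariance (assumed)
    R = 0.005 * np.eye(2)     # observation-error covariance (assumed)
    xa = var3d_analysis(xb, y, H, B, R)
    print("analysis velocities:", np.round(xa, 3))
```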
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied or not. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimulus-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns. Obviously, this is not feasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only the circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, and as a necessary step to set the scene, a detailed and systematic study on functional timing analysis is also presented.
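The contrast between topological and functional timing analysis can be illustrated with a short sketch: the Python code below computes the topological critical delay as the longest path in the directed acyclic graph, with no sensitization check, which is exactly the source of pessimism that functional analysis removes. The circuit and gate delays are made up for the example.

```python
from collections import defaultdict

# Minimal sketch of topological timing analysis: the critical delay is the longest
# path in the DAG that models a combinational block. Functional timing analysis
# would additionally check whether that path can propagate a transition (i.e. is
# sensitizable); this sketch deliberately omits that check. Delays are illustrative.

def critical_delay(edges, delays, primary_inputs):
    """edges: dict gate -> list of fanout gates; delays: dict gate -> delay.
    Returns the topological (possibly pessimistic) critical delay."""
    indegree = defaultdict(int)
    for src, fanout in edges.items():
        for dst in fanout:
            indegree[dst] += 1
    arrival = {g: delays.get(g, 0.0) for g in primary_inputs}
    frontier = list(primary_inputs)          # Kahn-style topological traversal
    while frontier:
        g = frontier.pop()
        for nxt in edges.get(g, []):
            arrival[nxt] = max(arrival.get(nxt, 0.0), arrival[g] + delays.get(nxt, 0.0))
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                frontier.append(nxt)
    return max(arrival.values())

if __name__ == "__main__":
    edges = {"a": ["g1"], "b": ["g1", "g2"], "g1": ["g3"], "g2": ["g3"]}
    delays = {"a": 0.0, "b": 0.0, "g1": 2.0, "g2": 1.0, "g3": 3.0}
    print("critical delay:", critical_delay(edges, delays, ["a", "b"]))  # 5.0
```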
Abstract:
We consider exchange economies with a continuum of agents and differential information about finitely many states of nature. It was proved in Einy, Moreno and Shitovitz (2001) that if we allow for free disposal in the market clearing (feasibility) constraints, then an irreducible economy has a competitive (or Walrasian expectations) equilibrium, and moreover, the set of competitive equilibrium allocations coincides with the private core. However, when feasibility is defined with free disposal, competitive equilibrium allocations may not be incentive compatible and contracts may not be enforceable (see e.g. Glycopantis, Muir and Yannelis (2002)). This is the main motivation for considering equilibrium solutions with exact feasibility. We first prove that the results in Einy et al. (2001) remain valid without free disposal. Then we define an incentive compatibility property motivated by the issue of contract execution, and we prove that every Pareto optimal exact feasible allocation is incentive compatible, implying that the contracts of competitive or core allocations are enforceable.
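For reference, the two feasibility notions being contrasted can be written, in standard notation for a continuum economy (not taken verbatim from the paper, with x_i the allocation of agent i, e_i the initial endowment, and μ the population measure), as:

```latex
% Feasibility with free disposal versus exact feasibility (standard definitions;
% notation illustrative, not copied from the paper).
\[
\underbrace{\int x_i \, d\mu \;\le\; \int e_i \, d\mu}_{\text{free disposal}}
\qquad\text{versus}\qquad
\underbrace{\int x_i \, d\mu \;=\; \int e_i \, d\mu}_{\text{exact feasibility}}
\]
```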
Abstract:
We extend the macroeconomic literature on (S,s)-type rules by introducing infrequent information in a kinked adjustment cost model. We first show that optimal individual decision rules are both state- and time-dependent. We then develop an aggregation framework to study the macroeconomic implications of such optimal individual decision rules. In our model, a vast number of agents act together, and more so when uncertainty is large. The average effect of an aggregate shock is inversely related to its size and to aggregate uncertainty. These results are in contrast with those obtained with full-information adjustment cost models.
Abstract:
This thesis analyzes the role of bureaucracies in the implementation of public policies in an environment of multiple principals, stakeholders, and agents, through a case study of the federal regulation of pesticides, which is assigned to three distinct agencies: MAPA, ANVISA, and IBAMA. The theoretical framework was built from theories of political control of the bureaucracy, theories on the sources of bureaucratic power, and the literature on public policy implementation. The format of the legislation and the level of complexity of the policy give the agencies exclusive responsibilities and numerous spaces of autonomy, while at the same time obliging them to decide by consensus. The bureaucracies adopt various strategies to minimize information asymmetry and moral hazard on the part of the regulated sector. Political principals make use of several instruments to impose their preferences, but do so superficially or sporadically. The low effectiveness of this influence is explained more by the principals' limitations than by the agents' resistance. The Judiciary and the Ministério Público can be important partners or veto points for the agencies' regulatory action. The leadership style of the managers, and their view of what the role of the bureaucracy should be in a regulatory policy, explain the differences observed among the agencies regarding the search for alliances and strategic action vis-à-vis the other actors.
Abstract:
The paper provides an alternative model for an insurance market with three types of agents: households, providers of a service, and insurance companies. Households face uncertainty about future levels of income. Providers, if hired by a household, perform a diagnosis and privately learn a signal. For each signal there is a procedure that maximizes the likelihood of the household obtaining the good state of nature. The paper assumes that providers care about their income and also about the likelihood that households will obtain the good state of nature (sympathy assumption). This assumption is satisfied if, for example, they care about their reputation or if there are possible litigation costs in case they do not use the appropriate procedure. Finally, insurance companies offer contracts to both providers and households. The paper provides sufficient conditions for the existence of equilibrium and shows that the sympathy assumption leads to a loss of welfare for the households due to the need to give providers incentives to choose the least expensive treatment.
Abstract:
The paper extends the cost of altruism model analyzed in Lisboa (1999). There are three types of agents: households, providers of a service, and insurance companies. Households face uncertainty about future levels of income. Providers, if hired by a household, have to choose a non-observable level of effort, perform a diagnosis, and privately learn a signal. For each signal there is a procedure that maximizes the likelihood of the household obtaining the good state of nature. Finally, insurance companies offer contracts to both providers and households. The paper provides sufficient conditions for the existence of equilibrium and shows that the optimal contract induces providers to care about their income and also about the likelihood that households will obtain the good state of nature, which in Lisboa (1999) was stated as the altruism assumption. Equilibrium is inefficient in comparison with the standard moral hazard outcome whenever a high level of effort is chosen, precisely due to the need to give providers incentives to choose the least expensive treatment for some signals. We show, however, that an equilibrium is always constrained optimal.
Abstract:
Population ageing is a problem that countries will have to cope with within a few years. How would changes in the social security system affect individual behaviour? We develop a multi-sectoral life-cycle model with both retirement and occupational choices to evaluate the macroeconomic impacts of social security reforms. We calibrate the model to match the 2011 Brazilian economy and perform a counterfactual exercise on the long-run impacts of a recently adopted reform. In 2013, the Brazilian government brought the two segregated social security schemes closer together, imposing a ceiling on public pensions. In the benchmark equilibrium, our model economy is able to reproduce early retirement claiming, the agents' stationary distribution across sectors, as well as the social security deficit and the public job application decision. In the counterfactual exercise, we find a significant reduction of 55% in the social security deficit, an increase of 1.94% in the capital-to-output ratio, with both output and capital growing, a delay in the retirement claims of public workers, and a change in the composition of agents applying for public sector jobs.
Abstract:
What factors influence the variety of the task sequences that make up organizational routines? This study focuses on analyzing how antecedent factors of executions influence the sequential variety of organizational routines. Organizational routines confer efficiency and coordination on organizational processes through the standardization and specialization of tasks and of their chaining. The literature suggests that high levels of variability can be important for maintaining flexibility in organizational processes (Feldman and Pentland, 2003). Sequential variety is regarded as the most faithful expression of the diversity of configurations of the task sequences that make up an organizational routine. This study proposes a qualitative methodology for analyzing the sources of sequential variety. It uses the framework proposed in Becker (2005b), which covers the antecedents task complexity, task interdependence, time pressure, task-related uncertainty, and change of agents, in addition to characteristics and outcomes. To achieve this objective, two observation studies were carried out in emergency rooms of organizations in São Paulo. The organizational routine of attending to patients in emergency rooms is a relevant process to study because it is the main form of access to treatment for patients in the two hospitals analyzed. Moreover, the routine proves to be quite efficient and is characterized by meeting international standards of process quality. The data were systematized through a content analysis adapted to the study of sequential variety. This analysis made it possible to identify the sources of sequential variety and discuss them in the context of the organizational routines literature. Four main sources were identified: priority setting, linked to time pressure; the need for specialists, linked to task complexity; the gathering of additional information for diagnosis and treatment, linked to task uncertainty; and the prolonging of treatment, linked to task uncertainty and interdependence. There is no evidence that the change of agents influences sequential variety. This study proposes that the antecedents form two groups: external antecedents derive from issues related to the multiplicity of patient conditions, such as time pressure and task uncertainty; internal antecedents are linked to organizational rules and resources, such as task complexity and interdependence.
Abstract:
Shortly after the 2007-08 financial crisis, the Federal Reserve intervened to try to contain the recession. However, it not only lowered interest rates but also adopted unconventional policies, including direct lending to firms in high-grade credit markets. These new measures were controversial, and some opponents protested that they disproportionately helped people connected to the financial system who were already wealthy. We use a DSGE model for the analysis of unconventional monetary policy and introduce two distinct types of agents, capitalists and workers, to investigate its distributive impact. We find that the Fed's credit policy was successful in the labour market, which helps workers more, and introduced a new competitor into the banking market, the government, which hurts capitalists more. Hence, we find that the credit policy reduced inequality in the US.
Abstract:
This manuscript describes the development and validation of an ultra-fast, efficient, and high-throughput analytical method based on ultra-high performance liquid chromatography (UHPLC) equipped with a photodiode array (PDA) detection system, for the simultaneous analysis of fifteen bioactive metabolites in wines: gallic acid, protocatechuic acid, (−)-catechin, gentisic acid, (−)-epicatechin, syringic acid, p-coumaric acid, ferulic acid, m-coumaric acid, rutin, trans-resveratrol, myricetin, quercetin, cinnamic acid and kaempferol. A 50-mm column packed with 1.7-μm particles operating at elevated pressure (UHPLC strategy) was selected to attain ultra-fast analysis and highly efficient separations. In order to reduce the complexity of the wine extract and improve the recovery efficiency, a reverse-phase solid-phase extraction (SPE) procedure using as sorbent a new macroporous copolymer made from a balanced ratio of two monomers, the lipophilic divinylbenzene and the hydrophilic N-vinylpyrrolidone (Oasis™ HLB), was performed prior to UHPLC–PDA analysis. The calibration curves of the bioactive metabolites showed good linearity within the established range. Limits of detection (LOD) and quantification (LOQ) ranged from 0.006 μg mL−1 to 0.58 μg mL−1, and from 0.019 μg mL−1 to 1.94 μg mL−1, for gallic and gentisic acids, respectively. The average recoveries ± SD for the three concentration levels tested (n = 9) in red and white wines were, respectively, 89 ± 3% and 90 ± 2%. The repeatability, expressed as relative standard deviation (RSD), was below 10% for all the metabolites assayed. The validated method was then applied to red and white wines from different geographical origins (Azores, Canary and Madeira Islands). The most abundant component in the analysed red wines was (−)-epicatechin, followed by (−)-catechin and rutin, whereas in white wines syringic and p-coumaric acids were found to be the major phenolic metabolites. The method was completely validated, providing a sensitive analysis for the detection of bioactive phenolic metabolites and showing satisfactory data for all the parameters tested. Moreover, it proved to be an ultra-fast approach, allowing the separation of the fifteen bioactive metabolites investigated with high resolution power within 5 min.