857 results for "complexity of agents"
Abstract:
Recently, unethical conduct in the workplace has been a focus of the literature and the media. Unethical pro-organizational behavior (UPB) refers to unethical conduct that employees engage in to benefit the organization. Given the complexity of UPB, there is an increasing need to understand how and under what conditions this behavior originates within organizations. Based on a sample of 167 employees from seven organizations, the results support the moderated mediation model: an ethical leader increases employees' affective organizational commitment, which in turn increases the likelihood of engaging in UPB. However, this indirect relationship diminishes when employees feel authentic at work.
Abstract:
The work described in this thesis was performed at the Laboratory for Intense Lasers (L2I) of Instituto Superior Técnico, University of Lisbon (IST-UL). Its main contribution is a feasibility study of the broadband dispersive stages for an optical parametric chirped pulse amplifier based on the nonlinear crystal yttrium calcium oxyborate (YCOB). In particular, the goal of this work was the characterization and implementation of the several optical devices involved in the expansion and compression of the amplified pulses to durations of the order of a few optical cycles (20 fs). Such laser systems find application in fields such as medicine, telecommunications and machining, which require high-energy, ultrashort (sub-100 fs) pulses. The main challenge was the preliminary study of the performance of the broadband amplifier, which is essential for successfully handling pulses with bandwidths exceeding 100 nm when amplified from the μJ level to 20 mJ per pulse. In general, the control, manipulation and characterization of optical phenomena on the scale of a few tens of fs, at powers that can reach the PW level, are extremely challenging due to the complexity and nonlinearity of the radiation-matter interaction observed at this time scale and power level. For this purpose, the main dispersive components were characterized in detail, specifically addressing the demonstration of pulse expansion and compression. The tested bandwidths are narrower than the final ones, in order to confirm the parameters of these elements and predict the performance for the broadband pulses. The work performed led to additional tasks, such as a detailed characterization of the laser oscillator seeding the laser chain and the detection and cancellation of additional sources of dispersion.
Abstract:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and the predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made, and the limits on the ability of agents to deal with it, have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This shift was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function).
If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, this cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout.
Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decide alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; more self-delusion among minority-party voters decreases this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
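The costly-thinking mechanism of Chapter 2 can be sketched numerically. The sketch below uses assumed parameters that are not taken from the thesis (inverse demand P = 1 - q1 - q2, zero marginal cost, default quantity q_d = 0.4): a firm that "thinks" pays cost k and best responds to its rival; a firm that does not think produces the default quantity for free.

```python
# Toy sketch of a thinking-cost Cournot duopoly; all parameters are assumed
# for illustration and are NOT the thesis's actual model values.

def best_response(q_other):
    # Profit-maximizing quantity against a rival producing q_other.
    return (1.0 - q_other) / 2.0

def profit(q_own, q_other):
    # Cournot profit under inverse demand P = 1 - q1 - q2, zero marginal cost.
    return q_own * (1.0 - q_own - q_other)

def equilibria(k, q_d=0.4):
    # Return the pure-strategy equilibrium configurations for thinking cost k.
    found = set()

    # Both firms think: standard Nash quantities q* = 1/3. Deviating means
    # switching to the default quantity and saving the thinking cost k.
    q_star = 1.0 / 3.0
    if profit(q_star, q_star) - k >= profit(q_d, q_star):
        found.add("both_think")

    # Asymmetric: firm 1 thinks and best responds to firm 2's default choice.
    q1 = best_response(q_d)
    firm1_stays = profit(q1, q_d) - k >= profit(q_d, q_d)
    firm2_stays = profit(q_d, q1) >= profit(best_response(q1), q1) - k
    if firm1_stays and firm2_stays:
        found.add("asymmetric")

    # Both firms keep the default and avoid the thinking cost entirely.
    if profit(q_d, q_d) >= profit(best_response(q_d), q_d) - k:
        found.add("both_default")

    return found
```

With these illustrative numbers, a low cost (k = 0.002) supports only the both-think Nash outcome, an intermediate cost (k = 0.007) supports only the asymmetric equilibrium, and a high cost (k = 0.02) supports only the both-default outcome, matching the comparative statics described in the abstract.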
Abstract:
INTRODUCTION: Despite chemical and physical vector control strategies, persistent infestations of Triatoma sordida have been reported in a large part of Minas Gerais, Brazil, and the cause has been little investigated. We aimed to characterize the deltamethrin toxicological profile of peridomestic T. sordida populations from the Triângulo Mineiro area of Minas Gerais. METHODS: Susceptibility to deltamethrin was assessed in seventeen peridomestic T. sordida populations. Serial dilutions of deltamethrin in acetone (0.2 µL) were topically applied to first-instar nymphs (F1; five days old; fasting weight, 1.2 ± 0.2 mg). Dose-response results were analyzed using Probit software, and the lethal doses, slopes and resistance ratios were determined. Qualitative tests were also performed. RESULTS: The deltamethrin susceptibility profile of the T. sordida populations revealed resistance ratios ranging from 0.84 to 2.8. Mortality in response to a diagnostic dose was 100.0% in all populations. CONCLUSIONS: Given these results, the persistence of T. sordida infestations in the Triângulo Mineiro area despite the lack of insecticide resistance may be due to: 1) environmental degradation facilitating the dispersion of T. sordida and allowing colonization of artificial ecotopes; 2) operational failures; and 3) the complexity of the peridomicile in the study area. These variables are being investigated.
Abstract:
Nowadays, the data available to and used by companies is growing very fast, creating the need to use and manage it as efficiently as possible. To this end, data is replicated over multiple datacenters using different replication protocols, chosen according to needs such as higher availability or a stronger consistency level. The costs associated with full data replication can be very high, and most of the time full replication is not needed, since information can be logically partitioned. Another problem is that, by using datacenters to store and process information, clients become heavily dependent on them. We propose a partial replication protocol called ParTree, which replicates data to clients and organizes them in a hierarchy, using communication between clients to propagate information. This solution addresses some of these problems, namely by supporting partial data replication and an offline execution mode. Given the complexity of the protocol, formal verification is crucial to ensure its two correctness properties: causal consistency and preservation of data. The use of the TLA+ language and tools to formally specify and verify the proposed protocol is also described.
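The abstract does not describe ParTree's internal mechanism, but causal consistency, one of the two verified properties, is commonly enforced by delivering an update only after everything that causally precedes it has been applied. A minimal vector-clock delivery check (a generic sketch, not ParTree's actual rule) illustrates the property:

```python
# Generic sketch of causal delivery with vector clocks; this is NOT ParTree's
# actual mechanism, only an illustration of the causal-consistency property.

def can_deliver(msg_clock, sender, local_clock):
    # Deliver a message only if it is the next event from its sender and all
    # of its other causal dependencies have already been seen locally.
    for node, t in msg_clock.items():
        if node == sender:
            if t != local_clock.get(node, 0) + 1:
                return False  # a prior message from the sender is missing
        elif t > local_clock.get(node, 0):
            return False  # a causally preceding update is missing
    return True
```

A replica with local clock {"a": 1, "b": 0} can deliver b's message stamped {"a": 1, "b": 1}, but must hold back one stamped {"a": 2, "b": 1} until a's second update arrives.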
Abstract:
The purpose of this work is to develop a practicable approach for telecom firms to manage their credit risk exposure to their commercial agents' network. In particular, it approaches the problem of credit concession to clients from a corporate perspective and explores the particular scenario of agents that are part of the corporation's commercial chain and are therefore not end-users. The agents' network that served as a model for the presented study is composed of companies that are, at the same time, both clients and suppliers of the telecommunications company. In that sense, the credit exposure analysis must take into consideration all financial flows, both inbound and outbound. The current strain on the financial sector in Portugal and other peripheral European economies, combined with the high leverage of most companies, generates an environment prone to credit default risk. Under these circumstances, managing credit risk exposure is becoming an increasingly critical function for every company's financial department. The approach designed in the current study combines two traditional risk monitoring tools: credit risk scoring and credit limitation policies. The objective was to design a new credit monitoring framework that is more flexible, uses both external data and internal relationship history to assess risk, and takes into consideration commercial objectives inside the agents' network. Although not explored at length, the blueprint of a credit governance model was created for implementing the new credit monitoring framework inside the telecom firm. The telecom company that served as a model for the present work decided to implement the new credit monitoring framework after it was presented to its Executive Commission.
Abstract:
Doctoral thesis - Doctoral Program in Industrial and Systems Engineering (PDEIS)
Abstract:
A previous study on the resistance of Sesarma curacaoense larvae to starvation revealed facultative lecithotrophy during the zoeal stages, whereas the megalopa and first juvenile are exclusively feeding stages. In the present study, the gross morphology and fine structure of the foregut of S. curacaoense were investigated during the larval, megalopa and first juvenile stages. The foregut of the zoea I shows specific setae and an apparently functional filter press. The foregut undergoes changes in the zoea II (the last larval stage), with an increase in the number of setae, mainly on the cardiopyloric valve, and in the complexity of the filter press. After metamorphosis to the megalopa stage, the foregut becomes rather complex, with a gastric mill bearing a well-developed medial tooth and two lateral teeth. The foregut of the first juvenile is more specialized than that of the previous stage, showing characteristics similar to those of adult decapods. These results provide further evidence of facultative lecithotrophic development in the larvae of S. curacaoense.
Abstract:
Invasive aspergillosis (IA) is a life-threatening fungal disease commonly diagnosed among individuals with immunological deficits, namely hematological patients undergoing chemotherapy or allogeneic hematopoietic stem cell transplantation. Vaccines are not available, and despite improved diagnosis and antifungal therapy, the treatment of IA is associated with a poor outcome. Importantly, the risk of infection and its clinical outcome vary significantly even among patients with similar predisposing clinical factors and microbiological exposure. Recent insights into antifungal immunity have further highlighted the complexity of host-fungus interactions and the multiple pathogen-sensing systems activated to control infection. How to translate this information into clinical practice remains, however, a challenging issue in medical mycology. Here, we address recent advances in our understanding of the host-fungus interaction and discuss the application of this knowledge in potential strategies aimed at moving toward personalized diagnostics and treatment (theranostics) in immunocompromised patients. Ultimately, the integration of individual traits into a clinically applicable process to predict the risk and progression of disease, and the efficacy of antifungal prophylaxis and therapy, holds the promise of a pioneering innovation benefiting patients at risk of IA.
Abstract:
Master's dissertation in Industrial Engineering
Abstract:
Radiometric changes observed in multi-temporal optical satellite images play an important role in efforts to characterize selective-logging areas. The aim of this study was to analyze the multi-temporal behavior of spectral-mixture responses in satellite images of simulated selective-logging areas in the Amazon forest, considering red/near-infrared spectral relationships. Forest edges were used to infer the selective-logging infrastructure, using differently oriented edges in the transition between forest and deforested areas in the satellite images. TM/Landsat-5 images acquired on three dates with different solar-illumination geometries were used in this analysis. The method assumed that the radiometric responses of forest subject to selective-logging effects and of forest edges in contact with recent clear-cuts are related. The spatial-frequency attributes of the red/near-infrared bands for edge areas were analyzed. Analysis of scatter plots showed two groups of pixels that represent selective-logging areas. The size and radiometric-distance attributes representing these two groups were related to the solar-elevation angle. The results suggest that detection of timber exploitation areas is limited by the complexity of the selective-logging radiometric response. Thus, the accuracy of detecting selective logging can be influenced by the solar-elevation angle at the time of image acquisition. We conclude that images with lower solar-elevation angles are less reliable for the delineation of selective logging.
Abstract:
Pressures on the Brazilian Amazon forest have been accentuated by agricultural activities practiced by families encouraged to settle in this region in the 1970s by the government's colonization program. The aim of this study was to analyze the temporal and spatial evolution of land cover and land use (LCLU) in the lower Tapajós region, in the state of Pará. We contrast 11 watersheds that are generally representative of the colonization dynamics in the region. For this purpose, Landsat satellite images from three different years (1986, 2001 and 2009) were analyzed with Geographic Information Systems. Individual images were subjected to an unsupervised classification using the Maximum Likelihood Classification algorithm available in GRASS. The classes retained for the representation of LCLU in this study were: (1) slightly altered old-growth forest, (2) succession forest, (3) crop land and pasture, and (4) bare soil. The analysis and observation of general trends in the 11 watersheds show that LCLU is changing very rapidly. The average deforestation of old-growth forest across all the watersheds was estimated at more than 30% for the period from 1986 to 2009. The local-scale analysis of watersheds reveals the complexity of LCLU, notably in relation to large changes in the temporal and spatial evolution of watersheds. Proximity to the sprawling city of Itaituba is related to the highest rates of deforestation in two watersheds. The opening of roads such as the Transamazonian highway is associated with the second-highest rates of deforestation in three watersheds.
Abstract:
PURPOSE: The authors analyzed the 30-day and 6-month outcomes of 1,126 consecutive patients who underwent coronary stent implantation in 1996 and 1997. METHODS: The 30-day results and 6-month angiographic follow-up were analyzed for patients treated with coronary stents in 1996 and 1997. All patients underwent coronary stenting with high-pressure implantation (>12 atm) and an antiplatelet drug regimen (aspirin plus ticlopidine). RESULTS: During the study period, 1,390 coronary stents were implanted in 1,200 vessels of 1,126 patients; 477 patients were treated in 1996 and 649 in 1997. The number of percutaneous procedures performed using stents increased significantly in 1997 compared to 1996 (64% vs 48%, p=0.0001). The 30-day results were similar in both years; the success and stent thrombosis rates were equal (97% and 0.8%, respectively). The rates of new Q-wave myocardial infarction (1.3% vs 1.1%, 1996 vs 1997, p=NS), emergency coronary bypass surgery (1% vs 0.6%, 1996 vs 1997, p=NS) and 30-day death (0.2% vs 0.5%, 1996 vs 1997, p=NS) were similar. The 6-month restenosis rate was 25% in 1996 and 27% in 1997 (p=NS); the target vessel revascularization rate was 15% in 1996 and 16% in 1997 (p=NS). CONCLUSIONS: Intracoronary stenting showed a high success rate and a low incidence of new major coronary events at 30 days in both periods, despite the greater angiographic complexity of the patients treated in 1997. These adverse variables did not have a negative influence on the 6-month clinical and angiographic follow-up, with similar rates of restenosis and ischemia-driven target lesion revascularization.
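Comparisons such as the 25% vs 27% restenosis rates above are typically assessed with a two-proportion test. The sketch below uses hypothetical follow-up counts (not the study's data, which the abstract does not report) merely to reproduce those rates and show why such a difference comes out non-significant:

```python
import math

# Two-proportion z-test sketch; the counts below are hypothetical and are NOT
# taken from the study, they merely reproduce rates of 25% vs 27%.

def two_proportion_z(x1, n1, x2, n2):
    # z statistic for H0: p1 == p2, using the pooled proportion.
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. 100/400 restenoses (25%) vs 162/600 (27%): |z| < 1.96, i.e. p = NS
# at the 5% significance level.
z = two_proportion_z(100, 400, 162, 600)
```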
Abstract:
Fluorescence in situ hybridization (FISH) is based on the use of fluorescent staining dyes; however, the signal intensity of the images obtained by microscopy is seldom quantified accurately by the researcher. The development of innovative digital image processing programs and tools has sought to overcome this problem; nevertheless, the determination of fluorescence intensity in microscopy images still has issues, due to the lack of precision in the results and the complexity of existing software. This work presents FISHji, a set of new ImageJ methods for the automated quantification of fluorescence in images obtained by epifluorescence microscopy. To validate the methods, results obtained by FISHji were compared with results obtained by flow cytometry. The mean correlation between FISHji and flow cytometry was high and significant, showing that the imaging methods are able to accurately assess the signal intensity of fluorescence images. FISHji is available for non-commercial use at http://paginas.fe.up.pt/nazevedo/.
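The basic operation such tools automate can be sketched as follows. This is a minimal thresholding example under assumed inputs, not FISHji's actual ImageJ algorithm:

```python
# Minimal sketch of fluorescence quantification: mean intensity of the pixels
# above a background threshold in a 2D grayscale image. Illustration only,
# NOT FISHji's actual method.

def mean_signal(image, threshold):
    # Collect every pixel brighter than the background threshold.
    signal = [px for row in image for px in row if px > threshold]
    # Mean intensity of the signal pixels (0.0 if nothing exceeds the threshold).
    return sum(signal) / len(signal) if signal else 0.0
```

For example, `mean_signal([[10, 200], [220, 5]], 50)` averages the two bright pixels and ignores the dim background.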
Abstract:
OBJECTIVE: To analyze the predictive factors of complications after implantation of coronary stents in a consecutive cohort study. METHODS: Clinical and angiographic characteristics related to the procedure were analyzed, and the incidence of major cardiovascular complications (myocardial infarction, urgent surgery, new angioplasty, death) during the in-hospital phase was recorded. Data were stored in an Access database and analyzed using the SPSS 6.0 statistical program and a backwards stepwise multiple logistic regression model. RESULTS: One thousand and eighteen patients (mean age 61±11 years, 29% female) underwent 1,070 stent implantations. The angiographic success rate was 96.8%, the clinical success rate was 91%, and the incidence of major cardiovascular complications was 7.9%. The variables independently associated with major cardiovascular complications, with their respective odds ratios (OR), were: rescue stent, OR = 5.1 (2.7-9.6); filamentary stent, OR = 4.5 (2.2-9.1); first-generation tubular stent, OR = 2.4 (1.2-4.6); multiple stents, OR = 3 (1.6-5.6); complexity of the lesion, OR = 2.4 (1.1-5.1); thrombus, OR = 2 (1.1-3.5). CONCLUSION: The results stress the importance of angiographic variables and techniques in the risk of complications and draw attention to the influence of the stent's design on the result of the procedure.
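An odds ratio such as "rescue stent, OR = 5.1 (2.7-9.6)" is derived from a 2x2 table of exposure vs. complication. The sketch below shows the standard computation with a 95% Wald confidence interval, using hypothetical counts that are not the study's data:

```python
import math

# Odds ratio from a 2x2 table with a 95% Wald confidence interval.
# The example counts are hypothetical; they are NOT the study's data.

def odds_ratio(a, b, c, d):
    # a = exposed with complication, b = exposed without,
    # c = unexposed with complication, d = unexposed without.
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# e.g. 20/100 complications with one factor present vs 50/900 without it:
or_, (lo, hi) = odds_ratio(20, 80, 50, 850)
```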