977 results for Operation analysis
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power, which work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
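As a small, hedged illustration of the partition-function formalism mentioned in this abstract (a sketch with three players and hypothetical numbers, not taken from the thesis): the worth of a coalition is indexed by the entire coalition structure, which is exactly what allows free-riding externalities to be expressed.

```python
# Hypothetical 3-player partition function (illustrative numbers only).
# It maps a coalition embedded in a coalition structure to its worth, so the
# value of a coalition can depend on how the remaining players are organized.

def key(coalition, structure):
    """Hashable key: (coalition, coalition structure)."""
    return (frozenset(coalition), frozenset(frozenset(c) for c in structure))

partition_function = {
    # structure {{1},{2},{3}}: everyone stands alone
    key({1}, [{1}, {2}, {3}]): 1.0,
    key({2}, [{1}, {2}, {3}]): 1.0,
    key({3}, [{1}, {2}, {3}]): 1.0,
    # structure {{1},{2,3}}: player 1 free rides on the cooperation of 2 and 3
    key({1}, [{1}, {2, 3}]): 1.5,
    key({2, 3}, [{1}, {2, 3}]): 2.5,
    # grand coalition
    key({1, 2, 3}, [{1, 2, 3}]): 4.5,
}

# The grand coalition is efficient (4.5 > 1.5 + 2.5 > 3 * 1.0), yet player 1
# gains from staying out once 2 and 3 cooperate (the free-riding incentive
# that works against a global agreement).
print(partition_function[key({1}, [{1}, {2, 3}])])   # 1.5
```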
Abstract:
Background. Retail clinics, also called convenience care clinics, have become a rapidly growing trend since their initial development in 2000. These clinics are housed within a larger retail operation and are generally located in "big-box" discount stores such as Wal-Mart or Target, grocery stores such as Publix or H-E-B, or in retail pharmacies such as CVS or Walgreens (Deloitte Center for Health Solutions, 2008). Care is typically provided by nurse practitioners (NPs). Research indicates that this new health care delivery system reduces cost, raises quality, and provides a means of access to the uninsured population (e.g., Deloitte Center for Health Solutions, 2008; Convenient Care Association, 2008a, 2008b, 2008c; Hansen-Turton, Miller, Nash, Ryan, & Counts, 2007; Salinsky, 2009; Scott, 2006; Ahmed & Fincham, 2010). Some healthcare analysts even suggest that retail clinics offer a feasible solution to the shortage of primary care physicians facing the nation (AHRQ Health Care Innovations Exchange, 2010).

The development and performance of retail clinics are heavily dependent upon individual state policies regulating NPs. Texas currently has one of the most highly regulated practice environments for NPs (Stout & Elton, 2007; Hammonds, 2008). In September 2009, Texas passed Senate Bill 532 (SB 532), addressing the scope of practice of nurse practitioners in the convenience care model. In comparison to other states, this law still heavily regulates nurse practitioners. However, little research has been conducted to evaluate the impact of state laws regulating nurse practitioners on the development and performance of retail clinics.

Objectives. (1) To describe the potential impact that SB 532 has on retail clinic performance. (2) To discuss the effectiveness, efficiency, and equity of the convenience care model. (3) To describe possible alternatives to Texas' nurse practitioner scope of practice guidelines as delineated in Texas Senate Bill 532. (4) To describe the type of nurse practitioner state regulation (i.e., independent, light, moderate, or heavy) that best promotes the convenience care model.

Methods. State regulations governing nurse practitioners can be characterized as independent, light, moderate, or heavy. Four state NP regulatory types and the associated retail clinic performance were compared and contrasted with those of Texas using Dunn's and Aday's theoretical models for conducting policy analysis and evaluating healthcare systems. Criteria for measurement included effectiveness, efficiency, and equity. Comparison states were Arizona (Independent), Minnesota (Light), Massachusetts (Moderate), and Florida (Heavy).

Results. A comparative state analysis of Texas SB 532 and alternative NP scope of practice guidelines among the four comparison states (Arizona, Florida, Massachusetts, and Minnesota) indicated that SB 532 has minimal potential to affect the shortage of primary care providers (PCPs) in the state. Although SB 532 may increase the number of NPs a physician may supervise, NPs are still heavily restricted in their scope of practice and limited in their ability to act as primary care providers. Arizona's example of independent NP practice provided the best alternative for addressing the shortage of PCPs in Texas, as evidenced by a lower uninsured rate and fewer emergency department visits per 1,000 population. A survey of comparison states suggests that retail clinics thrive in states that more heavily restrict NP scope of practice as opposed to those that are more permissive, with the exception of Arizona.
An analysis of the effectiveness, efficiency, and equity of the convenience care model indicates that retail clinics perform well in the areas of effectiveness and efficiency but fall short in the area of equity.

Conclusion. Texas Senate Bill 532 represents an incremental step towards addressing the shortage of PCPs in the state. A comparative policy analysis of the other four states with varying degrees of NP scope of practice indicates that a more aggressive policy allowing for independent NP practice will be needed to achieve positive changes in health outcomes. Retail clinics offer a temporary solution to the shortage of PCPs and will need to expand their locations to poorer regions and incorporate some chronic care to achieve measurable health outcomes.
Abstract:
Electronic waste is a fairly new and largely unknown phenomenon. Accordingly, governments have only recently acknowledged electronic waste as a threat to the environment and public health. In attempting to mitigate the hazards associated with this rapidly growing toxic waste stream, governments at all levels have started to implement e-waste management programs. The legislation enacted to create these programs is based on extended producer responsibility (EPR) policy.

EPR shifts the burden of final disposal of e-waste from the consumer or municipal solid waste system to the manufacturer of electronic equipment. Applying an EPR policy is intended to send signals up the production chain to the manufacturer. The desired outcome is to change the methods of production in order to reduce production outputs/inputs, with the ultimate goal of changing product design. This thesis performs a policy analysis of the current e-waste policies at the federal and state levels of government, focusing specifically on Texas e-waste policies.

The Texas e-waste law, known as HB 2714 or the Texas Computer TakeBack Law, requires manufacturers to provide individual consumers with a free and convenient method for returning their used computers to manufacturers. The law is based on individual producer responsibility and shared responsibility among consumers, retailers, recyclers, and the Texas Commission on Environmental Quality (TCEQ).

Using a set of evaluation criteria created by the Organization for Economic Co-operation and Development (OECD), the Texas e-waste law was examined to determine its effectiveness at reducing the threat of e-waste in Texas. Based on the outcomes of the analysis, certain recommendations were made for the legislature to incorporate into HB 2714.

The results of the policy analysis show that HB 2714 is a poorly constructed law and does not provide the desired results seen in other states with EPR policies. The TakeBack Law does little to change the collection methods of manufacturers and even less to change their production habits. If the e-waste problem is to be taken seriously, HB 2714 must be amended to reflect the changes proposed in this thesis.
Abstract:
The present dataset includes results of the analysis of 227 zooplankton samples taken in and off the Sevastopol Bay in the Black Sea in 1976, 1979-1980, 1989-1990, 1995-1996 and 2002-2003. Exact coordinates for stations 1, 4, 5 and 6 are unknown and were determined using the Google Earth program. Data on the Ctenophora Mnemiopsis leidyi and Beroe ovata are not included. Juday net: vertical tows of a Juday net with a mouth area of 0.1 m**2 and a mesh size of 150 µm. Tows were performed by depth layers. Towing speed: about 0.5 m/s. Samples were preserved in a buffered 4% formaldehyde-seawater solution. Sampling volume was estimated by multiplying the mouth area by the wire length. The collected material was analysed using the method of portions (Yashnov, 1939). Samples were brought to a volume of 50-100 ml, depending upon zooplankton density, and mixed intensively until all organisms were distributed randomly in the sample volume. After that, 1 ml of the sample was taken with a calibrated Stempel pipette. This operation was performed twice. If the divergence between the two examined subsamples was more than 30%, one more subsample was examined. Large (>1 mm body length) and less abundant species were counted in 1/2, 1/4, 1/8, 1/16 or 1/32 of the sample. Counting and measuring of organisms were carried out in a Bogorov chamber under a stereomicroscope to the lowest taxon possible. The number of organisms per sample was calculated as the simple average of the two subsample counts, scaled to the full sample volume. Total abundance of mesozooplankton was calculated as the sum of taxon-specific abundances, and total abundance of Copepoda was calculated as the sum of copepod taxon-specific abundances.
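A minimal sketch of the counting arithmetic described above, under one plausible reading of the procedure (the function names and example numbers are ours, not part of the dataset): the per-sample count is the mean of the two 1 ml subsample counts scaled to the full sample volume, and abundance per cubic metre follows from the filtered volume, i.e. net mouth area times wire length.

```python
def organisms_per_sample(subsample_counts, sample_volume_ml, subsample_volume_ml=1.0):
    """Mean of the (usually two) subsample counts, scaled to the whole sample."""
    mean_count = sum(subsample_counts) / len(subsample_counts)
    return mean_count * sample_volume_ml / subsample_volume_ml

def abundance_per_m3(per_sample_count, mouth_area_m2=0.1, wire_length_m=50.0):
    """Individuals per m**3; filtered volume = net mouth area * towed wire length."""
    return per_sample_count / (mouth_area_m2 * wire_length_m)

# Hypothetical example: 42 and 38 individuals in two 1 ml subsamples taken from
# a sample brought to 60 ml, towed over a 50 m layer.
n_sample = organisms_per_sample([42, 38], sample_volume_ml=60)   # 2400 individuals
print(abundance_per_m3(n_sample, wire_length_m=50.0))            # 480 ind/m**3
```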
Abstract:
This article analyses the long-term performance of collective off-grid photovoltaic (PV) systems in rural areas. The use of collective PV systems for the electrification of small and medium-sized villages in developing countries has increased in recent years. They are basically set up as stand-alone installations (diesel hybrid or pure PV) with no connection to other electrical grids. Their particular conditions (isolated) and usual installation places (far from commercial/industrial centers) require an autonomous and reliable technology. Different but related factors affect their performance and the energy supply; some of them are strictly technical, but others depend on external issues such as the solar energy resource and the users' energy and power consumption. The work presented is based on the field operation of twelve collective PV installations supplying electricity to off-grid villages located in the province of Jujuy, Argentina. Five of them have PV generators as their sole power source, while the other seven include the support of diesel generator sets. Load demand evolution, energy productivity and fuel consumption are analyzed. In addition, energy generation strategies (PV/diesel) are discussed.
Abstract:
Burn-up credit analyses are based on depletion calculations that provide an accurate prediction of spent fuel isotopic contents, followed by criticality calculations to assess k_eff, the effective neutron multiplication factor.
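For reference, the standard textbook definition of the effective multiplication factor that such criticality calculations evaluate (not reproduced from the abstract itself):

```latex
k_{\mathrm{eff}} \;=\; \frac{\text{rate of neutron production by fission}}
                            {\text{rate of neutron absorption} + \text{rate of neutron leakage}},
\qquad k_{\mathrm{eff}} < 1 \ \Rightarrow\ \text{subcritical configuration}.
```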
Abstract:
This paper presents an analysis of the fault tolerance achieved by an autonomous, fully embedded evolvable hardware system, which uses a combination of partial dynamic reconfiguration and an evolutionary algorithm (EA). It demonstrates that the system may self-recover from both transient and cumulative permanent faults. This self-adaptive system, based on a 2D array of 16 (4×4) Processing Elements (PEs), is tested with an image filtering application. Results show that it may properly recover from faults in up to 3 PEs, that is, more than 18% cumulative permanent faults. Two fault models are used for testing purposes, at the PE and CLB (Configurable Logic Block) levels. Two self-healing strategies are also introduced, depending on whether fault diagnosis is available or not. They are based on scrubbing, fitness evaluation, dynamic partial reconfiguration and in-system evolutionary adaptation. Since most of these adaptability features are already available on the system for its normal operation, the resource cost for self-healing is very low (only some code additions in the internal microprocessor core).
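A hedged, self-contained toy sketch of the two self-healing strategies just described (our own simplification in Python, not the authors' embedded implementation; the fitness function, threshold and fault-injection points are placeholders):

```python
import random

FITNESS_THRESHOLD = 0.95          # placeholder acceptance level for the image filter
pe_array = [True] * 16            # 4x4 array of Processing Elements: True = healthy

def evaluate_fitness():
    """Toy fitness: fraction of healthy PEs (stands in for the filter quality check)."""
    return sum(pe_array) / len(pe_array)

def scrub(pe_index):
    """Rewrite one PE's configuration (repairs transient upsets in this toy model)."""
    pe_array[pe_index] = True

def evolve_one_generation():
    """Stand-in for the on-chip EA: re-map the function of one random faulty PE."""
    faulty = [i for i, ok in enumerate(pe_array) if not ok]
    if faulty:
        pe_array[random.choice(faulty)] = True

def self_heal(diagnosis_available, faulty_pes=()):
    # Strategy 1: fault diagnosis available, so scrub only the reported PEs.
    if diagnosis_available:
        for pe in faulty_pes:
            scrub(pe)
        if evaluate_fitness() >= FITNESS_THRESHOLD:
            return "recovered by scrubbing"
    # Strategy 2: no diagnosis (or scrubbing insufficient), so evolve new
    # configurations until the filter quality is acceptable again.
    while evaluate_fitness() < FITNESS_THRESHOLD:
        evolve_one_generation()
    return "recovered by evolutionary adaptation"

pe_array[3] = pe_array[7] = pe_array[12] = False   # inject 3 faults (>18% of PEs)
print(self_heal(diagnosis_available=False))
```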
Abstract:
A Near Infrared Spectroscopy (NIRS) industrial application was developed by the LPF-Tagralia team and transferred to a Spanish dehydrator company (Agrotécnica Extremeña S.L.) for the classification of dehydrator onion bulbs for breeding purposes. The automated operation of the system has allowed the classification of more than one million onion bulbs during the seasons 2004 to 2008 (Table 1). The performance achieved by the original model (R² = 0.65; SEC = 2.28 ºBrix) was enough for qualitative classification thanks to the broad range of variation of the initial population (18 ºBrix). Nevertheless, a reduction in the classification performance of the model has been observed over the seasons. One of the reasons put forward is the reduction of the range of variation that naturally occurs during a breeding process; the other is variation in parameters other than the variable of interest, whose effects are probably affecting the measurements [1]. This study applies Independent Component Analysis (ICA) to this highly variable dataset from a NIRS industrial application in order to identify the different sources of variation present across seasons.
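A minimal sketch of how ICA can be applied to such a spectral dataset (using scikit-learn's FastICA on synthetic placeholder spectra; the array shapes and number of components are assumptions, not the LPF-Tagralia pipeline):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Placeholder for the real data: one row per onion bulb, one column per wavelength.
rng = np.random.default_rng(0)
spectra = rng.normal(size=(1000, 256))              # (n_bulbs, n_wavelengths)

ica = FastICA(n_components=5, random_state=0)
scores = ica.fit_transform(spectra)                  # per-bulb weight of each source
signatures = ica.mixing_.T                           # spectral signature of each source

# Sources whose scores drift systematically from season to season would point to
# variation unrelated to the soluble solids content (ºBrix) of interest.
print(scores.shape, signatures.shape)                # (1000, 5) (5, 256)
```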
Abstract:
Set-Sharing analysis, the classic domain of Jacobs and Langen, has been widely used to infer several interesting properties of programs at compile-time, such as occurs-check reduction, automatic parallelization, finite-tree analysis, etc. However, performing abstract unification over this domain implies the use of a closure operation which makes the number of sharing groups grow exponentially. Much attention has been given in the literature to mitigating this key inefficiency in this otherwise very useful domain. In this paper we present two novel alternative representations for the traditional set-sharing domain, tSH and tNSH, which efficiently compress the sharing groups into fewer elements, enabling more efficient abstract operations, including abstract unification, without any loss of accuracy. Our experimental evaluation supports the claim that both representations can dramatically reduce the number of sharing groups, showing that they can be more practical solutions towards scalable set-sharing.
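For intuition, a small sketch of the closure (star-union) operation responsible for the exponential growth mentioned above (plain Python over frozensets; illustrative only, not the tSH/tNSH representations proposed in the paper):

```python
from itertools import combinations

def star_union(sharing_set):
    """Closure under union: every union of a non-empty subset of sharing groups."""
    groups = list(sharing_set)
    closed = set()
    for r in range(1, len(groups) + 1):
        for subset in combinations(groups, r):
            closed.add(frozenset().union(*subset))
    return closed

# Four independent sharing groups over the variables X, Y, Z and W ...
sh = {frozenset('X'), frozenset('Y'), frozenset('Z'), frozenset('W')}
# ... blow up to 2**4 - 1 = 15 groups after the closure, hence the exponential cost.
print(len(star_union(sh)))   # 15
```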
Abstract:
An n(++)-GaAs/p(++)-AlGaAs tunnel junction with a peak current density of 10,100 A cm⁻² is developed. This device is a tunnel junction for multijunction solar cells, grown lattice-matched on standard GaAs or Ge substrates, with the highest peak current density ever reported. The voltage drop for a current density equivalent to the operation of the multijunction solar cell at up to 10,000 suns is below 5 mV. Trap-assisted tunnelling is proposed to be behind this performance, which cannot be justified by simple band-to-band tunnelling. The metal-organic vapour-phase epitaxy growth conditions, which are at the limits of the transport-limited regime, and the heavy tellurium doping levels are the proposed origins of the defects enabling trap-assisted tunnelling. The hypothesis of trap-assisted tunnelling is supported by the observed annealing behaviour of the tunnel junctions, which cannot be explained in terms of dopant diffusion or passivation. For the integration of these tunnel junctions into a triple-junction solar cell, AlGaAs barrier layers are introduced to suppress the formation of parasitic junctions, but this is found to significantly degrade the performance of the tunnel junctions. However, the annealed tunnel junctions with barrier layers still exhibit a peak current density higher than 2,500 A cm⁻² and a voltage drop at 10,000 suns of around 20 mV, which are excellent properties for tunnel junctions and mean they can serve as low-loss interconnections in multijunction solar cells working at ultra-high concentrations.
Abstract:
Daily life in urban centers has led to increasing and more demanding freight requirements. Manufacturers, retailers and other urban agents have thus tended towards more frequent and smaller deliveries, resulting in a growing use of light freight vehicles (<3.5 tonnes). This paper characterizes and analyzes urban freight distribution in order to generate new ways of understanding the phenomenon. Based on a case study of two different-sized Spanish cities using GPS data, a vehicle observation survey and complementary drivers' interviews, the authors propose a categorization of urban freight distribution. The results confirm GPS as a useful tool that allows the integration of dynamic traffic assignment data and diverse traffic operation patterns during different periods of the day, thereby improving delivery performance.
Abstract:
A broadband primary standard for thermal noise measurements is presented, and its thermal and electromagnetic behaviour is analysed by means of a novel hybrid analytical-numerical simulation methodology. The standard consists of a broadband termination connected to a 3.5 mm coaxial airline partially immersed in liquid nitrogen, and is designed in order to obtain low reflectivity and low uncertainty in the noise temperature. A detailed sensitivity analysis is made in order to highlight the critical characteristics that most affect the uncertainty in the noise temperature, and also to determine the manufacturing and operation tolerances required for proper performance in the range 10 MHz to 26.5 GHz. Aspects such as the thermal bead design, the level of liquid nitrogen, the uncertainties associated with the temperatures, the physical properties of the materials in the standard and the simulation techniques are discussed.
Abstract:
During the first decade of the new millennium, fueled by the economic development in Spain, urban bus services were extended. Since 2008 and 2009, the onset of the economic crisis, the improvement of these services has been at risk due to economic problems. In this paper, the technical efficiency of the main urban bus companies in Spain during the 2004–2009 period is studied using SBM (slack-based measures) models and by establishing the slacks in the services' production inputs. The influence of a series of exogenous variables on the operation of the different services is also analyzed. It is concluded that only 24% of the case studies are efficient, and that some urban form variables can explain part of the inefficiency. The methodology used allows inefficiency to be studied in a disaggregated way that other DEA (data envelopment analysis) models do not.
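For reference, a sketch of the standard non-oriented slack-based measure evaluated for each bus company o (the usual SBM formulation; the notation is ours, with m inputs x_{io}, s outputs y_{ro}, and the input and output slacks s_i^-, s_r^+ referred to in the abstract):

```latex
\rho_o^{*} \;=\; \min_{\lambda,\,s^{-},\,s^{+}}\;
\frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}
     {1 + \frac{1}{s}\sum_{r=1}^{s} s_r^{+}/y_{ro}}
\quad\text{s.t.}\quad
x_o = X\lambda + s^{-},\quad
y_o = Y\lambda - s^{+},\quad
\lambda \ge 0,\; s^{-} \ge 0,\; s^{+} \ge 0,
```

so a company is SBM-efficient exactly when the optimum equals 1, i.e. when all of its input and output slacks are zero.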
Abstract:
The road to the automation of agricultural processes passes through the safe operation of autonomous vehicles. This requirement is already a fact for ground mobile units, but it has still not been well defined for aerial robots (UAVs), mainly because the regulations and legislation are quite diffuse or even nonexistent. Defining a common and global policy is therefore the challenge to tackle. This characterization has to be addressed from field experience. Accordingly, this paper presents the work done in this direction, based on the analysis of the most common sources of hazards when using UAVs for agricultural tasks. The work, based on the ISO 31000 standard, has been carried out by applying a three-step structure that integrates the identification, assessment and reduction procedures. The present paper describes how this method has been applied to analyze previous accidents and malfunctions during UAV operations in order to obtain the real failure causes. This has made it possible to highlight common risks and hazard sources and to propose specific safeguards and safety measures for the agricultural context.
Abstract:
Workflow technology continues to play an important role as a means for specifying and enacting computational experiments in modern science. Reusing and re-purposing workflows allow scientists to do new experiments faster, since the workflows capture useful expertise from others. As workflow libraries grow, scientists face the challenge of finding workflows appropriate for their task, understanding what each workflow does, and reusing relevant portions of a given workflow. We believe that workflows would be easier to understand and reuse if high-level views (abstractions) of their activities were available in workflow libraries. As a first step towards obtaining these abstractions, we report in this paper on the results of a manual analysis performed over a set of real-world scientific workflows from Taverna, Wings, Galaxy and VisTrails. Our analysis has resulted in a set of scientific workflow motifs that outline (i) the kinds of data-intensive activities that are observed in workflows (Data-Operation motifs), and (ii) the different manners in which activities are implemented within workflows (Workflow-Oriented motifs). These motifs are helpful for identifying the functionality of the steps in a given workflow, for developing best practices for workflow design, and for developing approaches for the automated generation of workflow abstractions.