48 results for event investigation
Abstract:
Excess entry refers to the high failure rate of new entrepreneurial ventures. Economic explanations suggest 'hit and run' entrants and risk-seeking behavior. A psychological explanation is that people (entrepreneurs) are overconfident in their abilities (Camerer & Lovallo, 1999). Characterizing entry decisions as ambiguous gambles, we alternatively suggest following Heath and Tversky (1991) that people seek ambiguity when the source of uncertainty is related to their competence. Overconfidence, as such, plays no role. This hypothesis is confirmed in an experimental study that also documents the phenomenon of reference group neglect. Finally, we emphasize the utility that people gain from engaging in activities that contribute to a sense of competence. This is an important force in economic activity that deserves more explicit attention.
Abstract:
In this paper we examine the determinants of wages and decompose the observed differences across genders into the "explained by different characteristics" and "explained by different returns" components, using a sample of Spanish workers. Apart from the conditional expectation of wages, we estimate the conditional quantile functions for men and women and find that both the absolute wage gap and the part attributed to different returns at each of the quantiles, far from being well represented by their counterparts at the mean, are greater as we move up in the wage range.
Abstract:
We study the contribution of money to business cycle fluctuations in the US, the UK, Japan, and the Euro area using a small scale structural monetary business cycle model. Constrained likelihood-based estimates of the parameters are provided and time instabilities analyzed. Real balances are statistically important for output and inflation fluctuations. Their contribution changes over time. Models giving money no role provide a distorted representation of the sources of cyclical fluctuations, of the transmission of shocks and of the events of the last 40 years.
Abstract:
While the theoretical industrial organization literature has long argued that excess capacity can be used to deter entry into markets, there is little empirical evidence that incumbent firms effectively behave in this way. Bagwell and Ramey (1996) propose a game with a specific sequence of moves and partially-recoverable capacity costs in which forward induction provides a theoretical rationalization for firm behavior in the field. We conduct an experiment with a game inspired by their work. In our data the incumbent tends to keep the market, in contrast to what the forward induction argument of Bagwell and Ramey would suggest. The results indicate that players perceive that the first mover has an advantage without having to pre-commit capacity. In our game, evolution and learning do not drive out this perception. We back these claims with data analysis, a theoretical framework for dynamics, and simulation results.
Abstract:
This paper investigates bilateral trade in banking services within the European Union. Attention is directed to two main issues: first, testing banks' motivations for setting up the different forms of overseas offices, and second, assessing the importance of barriers to entry across national European banking systems. Empirical results confirm the existence of different motivations for establishing representative offices, branches and subsidiaries in foreign locations. In addition, evidence is obtained on the importance of non-regulatory barriers that could hinder the emergence of a single European market for banking services.
Abstract:
This paper investigates whether information about fairness types can be useful in lowering dispute costs and enhancing bargaining efficiency. An experiment was conducted in which subjects were first screened using a dictator game, with the allocations chosen used to separate participants into two types. Mutually anonymous pairs of subjects then bargained, with a dispute cost structure imposed. Sorting with identification reduces dispute costs; there are also significant differences in bargaining efficiency across pairing types. Information about types is crucial for these differences and also strongly affects the relative bargaining success of the two types and the hypothetical optimal bargaining strategy.
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison of intense events that caused severe damage to people and territory is performed. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for producing a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method allows the spatial shift forecast error to be quantified and the error sources affecting each model's forecast to be identified. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work, including verification against a wider observational data set, is needed to support this statement.
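The non-parametric skill scores mentioned above are computed from a 2×2 contingency table of forecast versus observed exceedance of a rain threshold. The abstract does not name the specific scores used, so the sketch below illustrates a few standard ones (POD, FAR, CSI, frequency bias, ETS) on hypothetical counts:

```python
def skill_scores(hits, false_alarms, misses, correct_negatives):
    """Standard non-parametric skill scores from a 2x2 contingency table.

    hits              -> event forecast and observed
    false_alarms      -> event forecast but not observed
    misses            -> event observed but not forecast
    correct_negatives -> event neither forecast nor observed
    """
    n = hits + false_alarms + misses + correct_negatives
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    csi = hits / (hits + false_alarms + misses)     # critical success index
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    # Equitable Threat Score: CSI corrected for hits expected by chance
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return {"POD": pod, "FAR": far, "CSI": csi, "BIAS": bias, "ETS": ets}

# Hypothetical verification over 100 rain-gauge locations
scores = skill_scores(hits=20, false_alarms=10, misses=5, correct_negatives=65)
```

A perfect forecast gives POD = CSI = BIAS = 1 and FAR = 0; a bias above 1 indicates the model forecasts the event more often than it is observed.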
Abstract:
The 10 June 2000 event was the largest flash flood event that occurred in the Northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on analysis of the structures that produced the heavy rainfall, especially from the point of view of meteorological radar. Since this case is a good example of a Mediterranean flash flood event, a final objective of this paper is to describe the evolution of the rainfall structure clearly enough to be understood in an interdisciplinary forum. As such, it could be useful not only for improving conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and that developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfall, with special emphasis on the features of this MCS, its life cycle and its dynamics. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information were also considered. Results show that it was an NS-MCS, quasi-stationary during its mature stage as a consequence of the formation of a convective train, the different displacement directions of the 2-D structures and the 3-D structures, including the propagation of new cells, and the slow movement of the convergence line associated with the Mediterranean mesoscale low.
Abstract:
Leakage detection is an important issue in many chemical sensing applications. Leakage detection by thresholds suffers from important drawbacks when sensors have serious drifts or are affected by cross-sensitivities. Here we present an adaptive method based on a Dynamic Principal Component Analysis that models the relationships between the sensors in the array. In normal conditions a certain variance distribution characterizes the sensor signals. However, in the presence of a new source of variance the PCA decomposition changes drastically. In order to prevent the influence of sensor drifts the model is adaptive and is calculated in a recursive manner with minimum computational effort. The behavior of this technique is studied with synthetic signals and with real signals arising from oil vapor leakages in an air compressor. Results clearly demonstrate the efficiency of the proposed method.
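The core detection idea can be illustrated with a static PCA residual test: in normal operation the sensor signals lie close to a low-dimensional principal subspace, and a leak that breaks the usual inter-sensor correlations produces a large residual (Q statistic). The numpy sketch below is a deliberate simplification on synthetic data — the paper's method additionally uses time-lagged (dynamic) variables and a recursive, drift-adaptive model update, neither of which is reproduced here:

```python
import numpy as np

def pca_residual(X_train, x_new, n_components=1):
    """Squared prediction error (Q statistic) of a new sample
    against the principal subspace learned from training data."""
    mu = X_train.mean(axis=0)
    C = np.cov(X_train - mu, rowvar=False)
    w, V = np.linalg.eigh(C)          # eigenvalues in ascending order
    P = V[:, -n_components:]          # retained principal directions
    centered = x_new - mu
    r = centered - P @ (P.T @ centered)   # part not explained by the model
    return float(r @ r)

rng = np.random.default_rng(0)
# Two correlated sensors: in normal operation they move together
common = rng.normal(size=(500, 1))
X = np.hstack([common, common]) + 0.01 * rng.normal(size=(500, 2))

q_normal = pca_residual(X, np.array([1.0, 1.0]))  # consistent with the model
q_leak = pca_residual(X, np.array([1.0, -1.0]))   # breaks the correlation
```

The new variance source leaves the principal subspace almost unchanged along the normal direction, so `q_leak` is orders of magnitude larger than `q_normal`, which is what a threshold on the Q statistic exploits.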
Abstract:
We present computational approaches as alternatives to a recent microwave cavity experiment by S. Sridhar and A. Kudrolli [Phys. Rev. Lett. 72, 2175 (1994)] on isospectral cavities built from triangles. A straightforward proof of isospectrality is given, based on the mode-matching method. Our results show that the experiment is accurate to 0.3% for the first 25 states. The level statistics resemble those of a Gaussian orthogonal ensemble when the integrable part of the spectrum is removed.
Abstract:
The liquid-liquid critical point scenario of water hypothesizes the existence of two metastable liquid phases, low-density liquid (LDL) and high-density liquid (HDL), deep within the supercooled region. The hypothesis originates from computer simulations of the ST2 water model, but the stability of the LDL phase with respect to the crystal is still being debated. We simulate supercooled ST2 water at constant pressure, constant temperature, and constant number of molecules N for N ≤ 729 and times up to 1 μs. We observe clear differences between the two liquids, both structural and dynamical. Using several methods, including finite-size scaling, we confirm the presence of a liquid-liquid phase transition ending in a critical point. We find that the LDL is stable with respect to the crystal in 98% of our runs (we perform 372 runs for LDL or LDL-like states), and in 100% of our runs for the two largest system sizes (N = 512 and 729, for which we perform 136 runs for LDL or LDL-like states). In all these runs, tiny crystallites grow and then melt within 1 μs. Only for N ≤ 343 do we observe six events (over 236 runs for LDL or LDL-like states) of spontaneous crystallization after crystallites reach an estimated critical size of about 70 ± 10 molecules.
Abstract:
Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock slope characterization and monitoring. Landslide and rockfall movements can be detected by means of comparison of sequential scans. One of the most pressing challenges in natural hazards is the combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable detection of precursory displacements of millimetric magnitude. The experiment consists of a known displacement of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by analysis of the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
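The error reduction behind Nearest Neighbour averaging can be sketched in a few lines: replacing each point's measured change with the mean over its k nearest neighbours averages out roughly independent instrumental noise by about √k, while a spatially coherent displacement survives. The point cloud, noise level, and neighbourhood size below are hypothetical, not the paper's:

```python
import numpy as np

def nn_average(points, values, k=25):
    """Replace each point's value with the mean over its k nearest
    neighbours (brute force; fine for small clouds). For roughly
    independent noise this shrinks the error by about sqrt(k)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d2, axis=1)[:, :k]   # includes the point itself
    return values[idx].mean(axis=1)

rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, size=(400, 2))        # scan grid coordinates (x, y)
true_disp = 0.003                             # 3 mm real surface displacement
noisy = true_disp + rng.normal(0, 0.01, 400)  # 1 cm-level instrumental noise
smoothed = nn_average(pts, noisy, k=25)
```

Here the raw per-point comparison is dominated by noise (σ ≈ 10 mm versus a 3 mm signal), while the NN-averaged values scatter tightly around the true displacement, mirroring the paper's point that millimetric changes only become detectable after averaging.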
Abstract:
Sickness absence (SA) is an important social, economic and public health issue. Identifying and understanding the determinants, whether biological, regulatory or health services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and heterogeneity due to unknown, unmeasured, or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as may frequently occur in the analysis of SA duration. To overcome the computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the proposed model in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007, initiated by either a diagnosis of neoplasm or mental and behavioral disorders. As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups. The CPU time for the CFPM was substantially shorter than for the CFM. The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases.
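The survival-to-Poisson device that a Poisson-based model of this kind builds on can be sketched minimally: with one event indicator per episode and log(duration) as an exposure offset, a Poisson GLM recovers a constant-hazard survival model, and covariate coefficients are log hazard ratios. This is only the bridge, not the CFPM itself — the frailty term and event-dependence structure are not reproduced, and the data below are simulated:

```python
import numpy as np

def poisson_glm(X, y, offset, n_iter=30):
    """Poisson regression with a log-exposure offset, fitted by
    Newton-Raphson. With y = event indicator and offset = log(time
    at risk), this fits an exponential (constant-hazard) model."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta + offset)          # expected event counts
        grad = X.T @ (y - mu)                   # score vector
        hess = X.T @ (mu[:, None] * X)          # observed information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(2)
n = 5000
group = rng.integers(0, 2, n)          # e.g. one of two diagnosis groups
hazard = 0.1 * np.exp(0.7 * group)     # true log hazard ratio = 0.7
duration = rng.exponential(1.0 / hazard)   # episode durations
X = np.column_stack([np.ones(n), group])
y = np.ones(n)                         # every episode ends in an event
beta = poisson_glm(X, y, offset=np.log(duration))
```

The fitted `beta[1]` recovers the log hazard ratio between the groups; the computational appeal is that the Poisson likelihood factorizes over episodes, which is what makes large SA databases tractable.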
Abstract:
Recently a fingering morphology, resembling the hydrodynamic Saffman-Taylor instability, was identified in the quasi-two-dimensional electrodeposition of copper. We present here measurements of the dispersion relation of the growing front. The instability is accompanied by gravity-driven convection rolls at the electrodes, which are examined using particle image velocimetry. While at the anode the theory presented by Chazalviel et al. [J. Electroanal. Chem. 407, 61 (1996)] describes the convection roll, the flow field at the cathode is more complicated because of the growing deposit. In particular, analysis of the orientation of the velocity vectors reveals a lag in the development of the convection roll relative to the finger envelope.