916 results for over-generalization and under-generalization problems


Relevance: 100.00%

Abstract:

This paper reports results from five companies in the aerospace and automotive industries showing that over-commitment of technical professionals and under-representation of key skills on technology development and transition teams seriously impair team performance. The research finds that 40 percent of the projects studied were inadequately staffed, resulting in weaker team communication and alignment. Most importantly, weak staffing on these teams is associated with a doubling of the rate at which projects fail to reach full production. Those weakly staffed teams that did successfully insert technology into production systems were also much more likely than other teams to suffer development delays and late engineering changes. The conclusion is that the expense of project failure, delay, and late engineering changes in these companies must greatly outweigh the savings gained from reduced staffing, and that the same problem is likely to be found in other technology-intensive firms that treat project budgets as a cost to be minimized rather than an investment to be maximized.

Relevance: 100.00%

Abstract:

Using the Duncan–Hoffman model, the paper estimates the returns to educational mismatch on comparable, representative cross-sectional micro data from the mid-2000s for 25 European countries. The aim is to gauge the extent to which the main empirical regularities reported in the literature on the subject are confirmed by this data base. Based on tests proposed by Hartog and Oosterbeek, the author also considers whether the observed empirical patterns accord with the basic Mincerian human-capital model and Thurow's job-competition model. Estimates based on Heckman's sample-selection estimator show returns that are fairly consistent with those found in the literature; however, both the job-competition model and the Mincerian human-capital model can be rejected for most countries.
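For reference, the Duncan–Hoffman ("ORU") wage equation and the Hartog–Oosterbeek test restrictions can be sketched as follows (the notation is introduced here for illustration; the paper's exact specification and controls may differ):

    ln w_i = α + β_r S_i^r + β_o S_i^o + β_u S_i^u + γ′X_i + ε_i

where S_i^r is the schooling required in worker i's job, S_i^o and S_i^u are the years of over- and under-schooling, so attained schooling is S_i = S_i^r + S_i^o − S_i^u, and X_i collects controls. The basic human-capital model (only attained schooling matters) implies β_o = β_r and β_u = −β_r, while Thurow's job-competition model (only required schooling matters) implies β_o = β_u = 0; these are the restrictions the statistical tests reject for most countries.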

Relevance: 100.00%

Abstract:

Baylis & Driver (Nature Neuroscience, 2001) have recently presented data on the response of neurons in macaque inferotemporal cortex (IT) to various stimulus transformations. They report that neurons can generalize over contrast and mirror reversal, but not over figure-ground reversal. This finding is taken to demonstrate that "the selectivity of IT neurons is not determined simply by the distinctive contours in a display, contrary to simple edge-based models of shape recognition", citing our recently presented model of object recognition in cortex (Riesenhuber & Poggio, Nature Neuroscience, 1999). In this memo, I show that the main effects of the experiment can be obtained by performing the appropriate simulations in our simple feedforward model. This suggests that the contribution to IT cell tuning of the explicit edge-assignment processes postulated in (Baylis & Driver, 2001) might be smaller than expected.
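The memo's point can be made concrete with a toy filter-and-max-pool hierarchy in the spirit of the Riesenhuber–Poggio model (the code below is an illustrative sketch written for this summary, not the memo's actual simulations; all filter parameters and the test stimulus are assumptions):

    # Toy "S layer" (rectified oriented filters) + "C layer" (global max pool).
    import numpy as np
    from scipy.signal import convolve2d

    def gabor_bank(size=9, thetas=(0, 45, 90, 135)):
        # Zero-mean, odd Gabor-like filters at four orientations.
        ys, xs = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
        bank = []
        for t in np.deg2rad(thetas):
            xr = xs * np.cos(t) + ys * np.sin(t)
            g = np.exp(-(xs**2 + ys**2) / 8.0) * np.sin(2 * np.pi * xr / 4.0)
            bank.append(g - g.mean())
        return bank

    def pooled_response(img, bank):
        # Rectified filter responses, max-pooled over all positions.
        return np.array([np.abs(convolve2d(img, g, mode='valid')).max()
                         for g in bank])

    img = np.zeros((32, 32))
    img[8:24, 12:16] = 1.0                           # a bright vertical bar
    bank = gabor_bank()
    r = pooled_response(img, bank)
    r_contrast = pooled_response(1.0 - img, bank)    # contrast reversal
    r_mirror = pooled_response(img[:, ::-1], bank)   # mirror reversal
    # Rectification removes contrast polarity; max pooling with a
    # mirror-symmetric filter bank absorbs the flip (up to a permutation
    # of orientations, here 45 <-> 135 degrees):
    print(np.allclose(r, r_contrast),
          np.allclose(np.sort(r), np.sort(r_mirror)))   # True True

Figure-ground reversal, by contrast, changes the image content around the contour rather than applying a pixel-level symmetry, so the pooled responses need not match.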

Relevance: 100.00%

Abstract:

Countries specialised in tourism tend to face two problems with contradictory effects: the commons and the anti-commons, which lead to tourism over- and under-production, respectively. This paper develops a two-period model to analyse the joint effects of both problems on a small, remote tourism economy. Congestion and the complementarity between foreign transport and local tourism services are key features of this type of market. As a result, direct selling and the presence of foreign tour operators emerge as possible market arrangements, with different implications in terms of welfare and public intervention. Four main results are obtained. First, in the direct-selling situation the optimal policy depends on the relative importance of the two problems. Second, the existence of tour operators always leads to tourism over-production. Third, the presence of a single tour operator does not solve the congestion problem. Lastly, the switch from several tour operators to a single one is welfare-reducing.

Relevance: 100.00%

Abstract:

The project of articulating a theological ethics on the basis of liturgical anthropology is bound to fail if the necessary consequence is that one has to quit the forum of critical modern rationality. The risk of Engelhardt's approach is to limit rationality to a narrow vision of reason. Sin is not to be understood as the negation of human holiness, but as the negation of divine holiness. The only way to renew theological ethics is to understand sin as the anthropological and ethical expression of the biblical message of justification by faith alone. Sin is therefore a secondary category, which can only be interpreted in light of the positive manifestation of liberation, justification, and grace. The central issue of Christian ethics is not ritual purity or morality, but the experience, confession, and recognition of our own injustice in our dealings with God and men.

Relevance: 100.00%

Abstract:

The solvability of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, meaning that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result, and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes.

The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load on trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display.

Surprisingly, some of our results on fair exchange seem contradictory with those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
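As a toy illustration of the classic trusted-third-party arrangement recalled above (a sketch written for this summary, not the protocol developed in the work), the TTP releases the items only once it holds both, so either both parties receive the expected item or neither learns anything:

    # Toy fair exchange through a trusted third party (illustrative only).
    from dataclasses import dataclass, field

    @dataclass
    class TrustedThirdParty:
        deposits: dict = field(default_factory=dict)  # party name -> item

        def deposit(self, party: str, item: str) -> None:
            self.deposits[party] = item

        def settle(self, a: str, b: str):
            # Atomic step: release both items or none (the safety property).
            if a in self.deposits and b in self.deposits:
                return {a: self.deposits[b], b: self.deposits[a]}
            return None  # abort: no one learns the other's input

    ttp = TrustedThirdParty()
    ttp.deposit("alice", "item-A")
    assert ttp.settle("alice", "bob") is None      # Bob never deposits: abort
    ttp.deposit("bob", "item-B")
    assert ttp.settle("alice", "bob") == {"alice": "item-B", "bob": "item-A"}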

Relevance: 100.00%

Abstract:

Carbon dioxide is nowadays regarded as a primary anthropogenic greenhouse gas contributing to global warming. Hence, chemical fixation of CO2 has attracted much attention as a possible way to manufacture useful chemicals. One of the most interesting approaches to CO2 transformation is the synthesis of organic carbonates. Since conventional production technologies for these compounds involve poisonous phosgene and carbon monoxide, there is a need to develop novel synthetic methods that better match the principles of "Green Chemistry" towards protection of the environment and human health. Over the years, the synthesis of dimethyl carbonate has been under intensive investigation in academia and industry. This study was therefore directed towards an equally important homologue of the carbonic ester family, namely diethyl carbonate (DEC).

A novel method for synthesizing DEC from ethanol and CO2 over heterogeneous catalysts based on ceria (CeO2) was studied in a batch reactor. A plausible drawback of the reaction is its thermodynamic limitations: the calculated values reveal that the reaction is exothermic (ΔrH°298K = −16.6 kJ/mol) but does not occur spontaneously at room temperature (ΔrG°298K = +35.85 kJ/mol). Moreover, co-produced water easily shifts the reaction equilibrium towards the reactants, precluding high yields of the carbonate. Therefore, in-situ dehydration was applied using butylene oxide as a chemical water trap. A 9-fold enhancement in the amount of DEC was observed upon introduction of butylene oxide to the reaction medium, in comparison with the synthetic method without water removal. This result confirms that the reaction equilibrium was shifted in favour of the desired product and that the thermodynamic limitations of the reaction were mitigated by using butylene oxide as a water scavenger. To gain insight into the reaction network, kinetic experiments were performed over commercial cerium oxide. On the basis of the selectivity/conversion profile, it can be concluded that the one-pot synthesis of diethyl carbonate from ethanol, CO2, and butylene oxide occurs via a consecutive route involving the cyclic carbonate as an intermediate.

Since commercial cerium oxide suffers from deactivation already after the first reaction cycle, in-house CeO2 was prepared by a room-temperature precipitation technique. Variation of synthesis parameters such as synthesis time, calcination temperature, and pH of the reaction solution turned out to have a considerable influence on the physico-chemical and catalytic properties of CeO2. Increasing the synthesis time resulted in a high specific surface area of the cerium oxide, and the catalyst prepared over 50 h exhibited the highest amount of basic sites on its surface. Furthermore, synthesis at pH 11 yielded the cerium oxide with the highest specific surface area, 139 m2/g, among all prepared catalysts. The CeO2-pH11 catalyst also demonstrated the best catalytic activity, producing 2 mmol of DEC at 180 °C and a final reaction pressure of 9 MPa. In addition, ceria supported on the high-specific-surface-area silicas MCM-41, SBA-15, and silica gel was synthesized and tested for the first time as a catalyst in the synthesis of DEC. Deposition of cerium oxide on the MCM-41 and SiO2 supports resulted in a substantial increase in the basicity of the carrier materials, and hexagonal SBA-15 modified with 20 wt% ceria exhibited the second highest basicity in the series of supported catalysts. Evaluation of the catalytic activity of the ceria-supported catalysts showed that the reaction carried out over 20 wt% CeO2-SBA-15 generated the highest amount of DEC.
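As a quick sanity check on these thermodynamic figures, the standard relation between the Gibbs energy and the equilibrium constant (a back-of-the-envelope calculation added here for illustration, for the reaction 2 C2H5OH + CO2 ⇌ DEC + H2O at 298 K) gives:

    K = exp(−ΔrG°/RT) = exp(−35 850 / (8.314 × 298)) ≈ exp(−14.5) ≈ 5 × 10⁻⁷

An equilibrium constant this far below unity means the unassisted equilibrium lies almost entirely on the reactant side, which is why removing the co-produced water in situ (here with butylene oxide) is essential for obtaining useful DEC yields.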

Relevance: 100.00%

Abstract:

This work analyses the optimal menu of contracts offered by a risk-neutral principal to a risk-averse agent under moral hazard, adverse selection, and limited liability. There are two output levels, whose probabilities of occurrence are determined by the agent's privately chosen effort. The agent's cost of effort is also private information. First, we show that, without assumptions on the cost function, it is not possible to guarantee that the optimal contract menu is simple when the agent is strictly risk averse. Then, we provide sufficient conditions on the cost function under which it is optimal to offer a single contract, independently of the agent's risk aversion. Our full-pooling cases are caused by non-responsiveness, which is induced by the high cost of enforcing higher effort levels. We also show that limited liability generates non-responsiveness.
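A hedged sketch of the kind of program the paper studies (the notation here is ours for illustration; the paper's exact formulation may differ). With output levels x_H > x_L, wages w_H and w_L, success probability p(e) increasing in the effort e privately chosen by an agent of privately known cost type θ, concave utility u(·), and effort cost c(e, θ), the principal solves:

    max_{w_L, w_H, e}   p(e) (x_H − w_H) + (1 − p(e)) (x_L − w_L)
    s.t. (IC)  e ∈ argmax_{e'}  p(e') u(w_H) + (1 − p(e')) u(w_L) − c(e', θ)
         (IR)  p(e) u(w_H) + (1 − p(e)) u(w_L) − c(e, θ) ≥ 0
         (LL)  w_L ≥ 0,  w_H ≥ 0

together with truth-telling constraints over the menu offered to the different types θ. Non-responsiveness then means that the cost of screening types by enforcing higher effort is so high that offering a single (pooling) contract becomes optimal.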

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

This paper deals with the joint economic design of x̄ and R charts when the occurrence times of assignable causes follow Weibull distributions with increasing failure rates. The variable quality characteristic is assumed to be normally distributed and the process is subject to two independent assignable causes (such as tool wear-out, overheating, or vibration). One cause changes the process mean and the other changes the process variance. However, the occurrence of one kind of assignable cause does not preclude the occurrence of the other. A cost model is developed and a non-uniform sampling interval scheme is adopted. A two-step search procedure is employed to determine the optimum design parameters. Finally, a sensitivity analysis of the model is conducted, and the cost savings associated with the use of non-uniform sampling intervals instead of constant sampling intervals are evaluated.
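The abstract does not spell out the non-uniform scheme, but a common choice in this literature (an assumption here, not necessarily the exact scheme used in the paper) is to pick sampling times so that the integrated Weibull hazard is the same in every interval. With cumulative hazard H(t) = (t/θ)^β and shape β > 1, equal increments of H give t_j = t_1 · j^(1/β), so the intervals shrink as the failure rate rises. A minimal Python sketch:

    # Non-uniform sampling epochs with equal integrated Weibull hazard per
    # interval (illustrative; the paper's exact scheme may differ).
    # H(t) = (t / theta)**beta, so H(t_j) - H(t_{j-1}) is constant when
    # t_j = t_1 * j**(1 / beta); theta cancels out of the epoch formula.
    def sampling_times(t1: float, beta: float, n: int) -> list[float]:
        return [t1 * j ** (1.0 / beta) for j in range(1, n + 1)]

    times = sampling_times(t1=2.0, beta=2.0, n=6)
    intervals = [b - a for a, b in zip([0.0] + times, times)]
    print([round(t, 3) for t in times])      # 2.0, 2.828, 3.464, ...
    print([round(d, 3) for d in intervals])  # intervals shrink over time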

Relevance: 100.00%

Abstract:

Since the earliest developments of human history, friction has been a major issue. From the invention of the wheel and the use of the first lubricants to the study of coated and microtextured surfaces, significant effort has been put into improvements that could overcome the resistance to motion. A review by Holmberg, Andersson and Erdemir [1] shows that, in an average passenger car, about one third of the total energy consumption is due to friction losses; of these, another one third is consumed in the engine system. The optimization of the lubricating-oil formulation used in internal combustion engines is an important way to reduce friction, thereby improving energy efficiency and controlling emissions. Lubrication is also a way to ensure the required protection of the system by maintaining wear rates at an adequate level, which helps to minimize maintenance costs.

Relevance: 100.00%

Abstract:

The ice cover of the Arctic Ocean has been changing dramatically in recent decades, and the consequences for the sea-ice-associated ecosystem remain difficult to assess. Algal aggregates underneath sea ice have been described sporadically, but the frequency and distribution of their occurrence are not well quantified. We used upward-looking images obtained by a remotely operated vehicle (ROV) to derive estimates of ice-algal aggregate biomass and to investigate their spatial distribution. During the IceArc expedition (ARK-XXVII/3) of RV Polarstern in late summer 2012, different types of algal aggregates were observed floating underneath various ice types in the Central Arctic basins. Our results show that the floe-scale distribution of algal aggregates in late summer is very patchy and determined by the topography of the ice underside, with aggregates collecting in dome-shaped structures and at the edges of pressure ridges. The buoyancy of the aggregates was also evident from analysis of the aggregate size distribution. Different approaches used to estimate aggregate biomass yield a wide range of results; this highlights that special care must be taken when upscaling observations and comparing results from surveys conducted with different methods or on different spatial scales.

Relevance: 100.00%

Abstract:

The amount of solar radiation transmitted through Arctic sea ice is determined by the thickness and physical properties of snow and sea ice. Light transmittance is highly variable in space and time because the thickness and physical properties of snow and sea ice are highly heterogeneous on variable time and length scales. We present field measurements of under-ice irradiance along transects under undeformed land-fast sea ice at Barrow, Alaska (March, May, and June 2010). The measurements were performed with a spectral radiometer mounted on a floating under-ice sled. The objective was to quantify the spatial variability of light transmittance through snow and sea ice, and to compare this variability along its seasonal evolution. Along with the optical measurements, snow depth, sea-ice thickness, and freeboard were recorded, and ice cores were analyzed for chlorophyll a and particulate matter. Our results show that snow-cover variability prior to the onset of snowmelt causes as much relative spatial variability in light transmittance as the contrast between ponded and white ice during summer: both before and after melt onset, measured transmittances ranged from one third to three times the mean value. In addition, we found a twentyfold increase in light transmittance as a result of partial snowmelt, showing that the seasonal evolution of transmittance through sea ice far exceeds its spatial variability. Prior to melt onset, however, light transmittance was stable over time, and differences in under-ice irradiance were directly related to the spatial variability of the snow cover.

Relevance: 100.00%

Abstract:

Measurements of solar radiation over and under sea ice were performed at various stations in the Arctic Ocean during the Polarstern cruise PS92 (TRANSSIZ) between 19 May and 30 June 2015. All radiation measurements were performed with Ramses spectral radiometers (Trios, Rastede, Germany). All data are given in full spectral resolution, interpolated to 1.0 nm, and integrated over the entire wavelength range (broadband, total: 320 to 950 nm). Two sensors were mounted on a remotely operated vehicle (ROV) and one radiometer was installed on the sea ice for surface reference measurements (solar irradiance). On the ROV, one irradiance sensor (cosine collector) for energy-budget calculations and one radiance sensor (9° opening angle) for capturing high-resolution spatial variability were installed. Along with the radiation measurements, ROV positions were obtained from acoustic USBL positioning, and vehicle depth, distance to the ice, and attitude were recorded. All times are given in UTC.