737 results for neglect
Abstract:
The problem addressed in the present study is the system of relations between organizational trust, organizational commitment, and the behavioural strategy of neglect. The main hypotheses are that the Trust construct influences the behavioural strategy Neglect and that this relation is mediated by Organizational Commitment, viewed through Allen and Meyer's (1991) Three-Component Model. A questionnaire was developed from three scales: Robinson's (1996) Trust Scale; the Organizational Commitment Scale developed by Allen and Meyer (1997) and validated for the Portuguese population by Nascimento, Lopes and Salgueiro (2008); and an EVLN Model scale built on three existing scales, one by Rusbult et al. (1998), one by Withey and Cooper (1989) and one by Hagedoorn et al. (1999). The questionnaire was administered to a random sample. The literature review for this article drew on articles and books in the organizational field. The main findings were that Trust influences Neglect and that Affective Commitment moderates this relation.
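As a point of reference for the moderation finding, the sketch below shows the standard interaction-term regression test it implies. All variable names and data are hypothetical; this is a generic illustration, not the study's analysis.

```python
# Hypothetical sketch of a moderation test: does affective commitment
# moderate the effect of organizational trust on neglect?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
trust = rng.normal(size=n)        # organizational trust (standardized)
affective = rng.normal(size=n)    # affective commitment (standardized)
# Synthetic outcome with a built-in trust x commitment interaction.
neglect = -0.5 * trust - 0.3 * affective + 0.2 * trust * affective + rng.normal(size=n)

X = sm.add_constant(np.column_stack([trust, affective, trust * affective]))
fit = sm.OLS(neglect, X).fit()
print(fit.params)  # a significant interaction coefficient indicates moderation
```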
Abstract:
Measurements of the top-of-the-atmosphere outgoing longwave radiation (OLR) for July 2003 from Meteosat-7 are used to assess the performance of the numerical weather prediction version of the Met Office Unified Model. A significant difference is found over desert regions of northern Africa, where the model emits too much OLR, by up to 35 W m⁻² in the monthly mean. By cloud-screening the data we find an error of up to 50 W m⁻² associated with cloud-free areas, which suggests an error in the model surface temperature, surface emissivity, or atmospheric transmission. By building a physical model of the radiative properties of mineral dust based on in situ, surface-based, and satellite remote-sensing observations, we show that the most plausible explanation for the discrepancy in OLR is the neglect of mineral dust in the model. The calculations suggest that mineral dust can exert a longwave radiative forcing of as much as 50 W m⁻² in the monthly mean at 1200 UTC in cloud-free regions, which accounts for the discrepancy between the model and the Meteosat-7 observations. This suggests that inclusion of the radiative effects of mineral dust will lead to a significant improvement in the radiation balance of numerical weather prediction models, with subsequent improvements in performance.
Abstract:
Although the potential importance of scattering of long-wave radiation by clouds has been recognised, most studies have concentrated on the impact of high clouds, and few estimates of the global impact of scattering have been presented. This study shows that scattering in low clouds has a significant impact on outgoing long-wave radiation (OLR) in regions of marine stratocumulus (−3.5 W m⁻² for overcast conditions), where the column water vapour is relatively low. This corresponds to an enhancement of the greenhouse effect of such clouds by 10%. The near-global impact of scattering on OLR is estimated to be −3.0 W m⁻², with low clouds contributing −0.9 W m⁻², mid-level clouds −0.7 W m⁻², and high clouds −1.4 W m⁻². Although this effect appears small compared with the global mean OLR of 240 W m⁻², it indicates that neglect of scattering will lead to an error in cloud long-wave forcing of about 10% and an error in net cloud forcing of about 20%.
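For orientation, the quoted error percentages pin down the cloud-forcing magnitudes they imply (these magnitudes are not stated explicitly in the abstract):

\[
\frac{3.0\ \mathrm{W\,m^{-2}}}{\mathrm{CRF_{LW}}} \approx 10\% \;\Rightarrow\; \mathrm{CRF_{LW}} \approx 30\ \mathrm{W\,m^{-2}},
\qquad
\frac{3.0\ \mathrm{W\,m^{-2}}}{\lvert \mathrm{CRF_{net}} \rvert} \approx 20\% \;\Rightarrow\; \lvert \mathrm{CRF_{net}} \rvert \approx 15\ \mathrm{W\,m^{-2}}.
\]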
Abstract:
The annual and interannual variability of idealized, linear, equatorial waves in the lower stratosphere is investigated using the temperature and velocity fields from the ECMWF 15-year re-analysis dataset. Peak Kelvin wave activity occurs during solstice seasons at 100 hPa, during December-February at 70 hPa and in the easterly to westerly quasi-biennial oscillation (QBO) phase transition at 50 hPa. Peak Rossby-gravity wave activity occurs during equinox seasons at 100 hPa, during June-August/September-November at 70 hPa and in the westerly to easterly QBO phase transition at 50 hPa. Although neglect of wind shear means that the results for inertio-gravity waves are likely to be less accurate, they are still qualitatively reasonable and an annual cycle is observed in these waves at 100 hPa and 70 hPa. Inertio-gravity waves with n = 1 are correlated with the QBO at 50 hPa, but the eastward inertio-gravity n = 0 wave is not, due to its very fast vertical group velocity in all background winds. The relative importance of different wave types in driving the QBO at 50 hPa is also discussed. The strongest acceleration appears to be provided by the Kelvin wave while the acceleration provided by the Rossby-gravity wave is negligible. Of the higher-frequency waves, the westward inertio-gravity n = 1 wave appears able to contribute more to the acceleration of the 50 hPa mean zonal wind than the eastward inertio-gravity n = 1 wave.
Abstract:
We compare laboratory observations of equilibrated baroclinic waves in the rotating two-layer annulus, with numerical simulations from a quasi-geostrophic model. The laboratory experiments lie well outside the quasi-geostrophic regime: the Rossby number reaches unity; the depth-to-width aspect ratio is large; and the fluid contains ageostrophic inertia–gravity waves. Despite being formally inapplicable, the quasi-geostrophic model captures the laboratory flows reasonably well. The model displays several systematic biases, which are consequences of its treatment of boundary layers and neglect of interfacial surface tension and which may be explained without invoking the dynamical effects of the moderate Rossby number, large aspect ratio or inertia–gravity waves. We conclude that quasi-geostrophic theory appears to continue to apply well outside its formal bounds.
Abstract:
The absorption intensities of the two infra-red active vibrations in methane have been obtained from a perturbation calculation on the equilibrium wave functions derived in the preceding paper. The perturbation field is the change in the potential field due to the nuclei which results from moving the nuclei in the vibrational coordinate concerned, and a simplified form of second order perturbation theory, developed by Pople and Schofield, is used for the calculation. The main approximation involved is the neglect of f and higher harmonics in the spherical harmonic expansion of the nuclear field. The resulting dipole moment derivatives are approximately three times larger than the experimental values, but they show qualitative features and sign relationships which are significant.
Abstract:
Two formulations for the potential energy for slantwise motion are compared: one which applies strictly only to two-dimensional flows (SCAPE) and a three-dimensional formulation based on a Bernoulli equation. The two formulations share an identical contribution from the vertically integrated buoyancy anomaly and a contribution from different Coriolis terms. The latter arise from the neglect of (different) components of the total change in kinetic energy along a trajectory in the two formulations. This neglect is necessary in order to quantify the potential energy available for slantwise motion relative to a defined steady environment.
Abstract:
The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m² range, rather than the few J/m² of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, following which consideration of surface work has been omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m² values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional 'plasticity and friction only' analyses seem to have no quantitative explanation are now given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at those very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature, and thermomechanical treatment, and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
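A minimal numerical sketch of the intercept argument, with illustrative numbers rather than the paper's measurements: if cutting force follows F_c = k·w·t + R·w, a straight-line fit of force against uncut chip thickness t recovers the specific surface work R from the intercept, and ignoring that intercept inflates the apparent shear stress at small t.

```python
# Illustrative sketch (synthetic data, assumed parameters): recover the
# specific surface work R from the intercept of cutting force vs. thickness.
import numpy as np

rng = np.random.default_rng(0)
w = 3.0e-3                                      # cut width (m), assumed
t = np.array([50, 100, 150, 200, 250]) * 1e-6   # uncut chip thickness (m)
k = 2.0e9                                       # plasticity term (N/m^2), assumed
R_true = 20e3                                   # specific surface work (J/m^2), kJ/m^2 range
F_c = k * w * t + R_true * w + rng.normal(0.0, 0.5, t.size)  # cutting force (N)

slope, intercept = np.polyfit(t, F_c, 1)
print(f"recovered R = {intercept / w / 1e3:.1f} kJ/m^2")

# Neglecting the intercept, the apparent shear stress F_c/(w*t) rises as t
# shrinks: the spurious size effect the paper explains.
print(F_c / (w * t) / 1e9)  # apparent stress in GPa, largest at the smallest t
```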
Abstract:
This commentary seeks to complement the contribution of the Building Research & Information special issue on 'Developing Theories for the Built Environment' (2008) by highlighting the important role of middle-range theories within the context of professional practice. Middle-range theories provide a form of theorizing that lies between abstract grand theorizing and atheoretical local descriptions. They are also characterized by the way in which they directly engage with the concerns of practitioners. In the context of professional practice, any commitment to theorizing should habitually be combined with an equivalent commitment to empirical research; rarely is it appropriate to neglect one in favour of the other. Any understanding of the role that theory plays in professional practice must further be informed by Schön's seminal ideas on reflective practice. Practitioners are seen to utilize theories as inputs to a process of continuous reflection, thereby guarding against complacency and routinization. The authors would challenge any assumption that academics alone are responsible for generating theories, thereby limiting the role of practitioners to their application. Such a dichotomized view is contrary to established ideas on Mode 2 knowledge production and current trends towards co-production research in the context of the built environment.
Abstract:
The tendency to neglect base-rates in judgment under uncertainty may be "notorious," as Barbey & Sloman (B&S) suggest, but it is neither inevitable (as they document; see also Koehler 1996) nor unique. Here we would like to point out another line of evidence connecting ecological rationality to dual processes, the failure of individuals to appropriately judge cumulative probability.
Abstract:
Perceptual multimedia quality is of paramount importance to the continued take-up and proliferation of multimedia applications: users will not use and pay for applications if they are perceived to be of low quality. Whilst distributed multimedia quality has traditionally been characterised by Quality of Service (QoS) parameters, these neglect the user's perspective on quality. In order to redress this shortcoming, we characterise the user multimedia perspective using the Quality of Perception (QoP) metric, which encompasses not only a user's satisfaction with the quality of a multimedia presentation, but also his/her ability to analyse, synthesise and assimilate the informational content of multimedia. In recognition of the fact that monitoring eye movements offers insights into visual perception, as well as the associated attention mechanisms and cognitive processes, this paper reports the results of a study investigating the impact of differing multimedia presentation frame rates on user QoP and eye-path data. Our results show that provision of higher frame rates, usually assumed to provide better multimedia presentation quality, does not significantly affect the median coordinate value of eye-path data. Moreover, higher frame rates do not significantly increase the level of participant information assimilation, although they do significantly improve overall user enjoyment and quality perception of the multimedia content being shown.
Abstract:
This paper examines the changes in the length of commercial property leases over the last decade and presents an analysis of the consequent investment and occupational pricing implications for commercial property investments. It is argued that the pricing implications of a short lease to an investor are contingent upon the expected costs of the letting termination to the investor, the probability that the letting will be terminated, and the volatility of rental values. The paper examines the key factors influencing these variables and presents a framework for incorporating their effects into pricing models. Approaches to their valuation derived from option pricing are critically assessed. It is argued that such models also tend to neglect the price effects of specific risk factors such as tenant circumstances and the terms of the break clause. Specific risk factors have a significant bearing on the probability of letting termination and on the level of the resultant financial losses. The merits of a simulation methodology are examined for rental and capital valuations of short leases and properties with break clauses. It is concluded that, in addition to the rigour of its internal logic, the success of any methodology is predicated upon the accuracy of the inputs. The lack of reliable data on patterns in, and incidence of, lease termination, and the lack of reliable time series of historic property performance, limit the efficacy of financial models.
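A minimal Monte Carlo sketch of the kind of valuation adjustment discussed, combining rental volatility with both rational and idiosyncratic (tenant-circumstance) break behaviour. All parameters are assumed for illustration; none are given in the abstract.

```python
# Hypothetical Monte Carlo sketch: expected present-value loss from a
# year-5 break clause, per 100 of annual passing rent.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
rent0, sigma, r, T = 100.0, 0.15, 0.05, 5.0  # passing rent, rental vol, discount rate, break date
p_idio = 0.10                                 # idiosyncratic break probability, assumed
void_cost = 1.5 * rent0                       # assumed void period plus re-letting costs

# Market rent at the break date under driftless lognormal growth.
rentT = rent0 * np.exp(-0.5 * sigma**2 * T + sigma * np.sqrt(T) * rng.standard_normal(n))
# Break if market rent has fallen well below passing rent, or for tenant-specific reasons.
breaks = (rentT < 0.9 * rent0) | (rng.random(n) < p_idio)
loss = np.where(breaks, void_cost + np.maximum(rent0 - rentT, 0.0), 0.0) * np.exp(-r * T)
print(f"expected PV loss: {loss.mean():.1f} per {rent0:.0f} of annual rent")
```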
Abstract:
The reduction in southern midlatitude ozone is quantified by evaluating the trajectories of ozone-depleted air masses, assuming that photochemical recovery of ozone in advected air parcels can be ignored. This procedure is carried out for the 3 months from 15 October to 15 January for each of the years 1998, 1999, and 2000. Two distinct source regions, the vortex core and the vortex edge, are considered, and for each day, diabatic reverse domain filling calculations are performed for an ensemble of parcels between 30°S and 60°S and 400–700 K in altitude. In 1998, 1999, and 2000 the mean calculated ozone reduction is 16, 18, and 19 DU, respectively. Air parcels from the vortex edge region are significant contributors to the reduction, especially during spring. Results for four longitudinal and three latitudinal midlatitude subregions are also presented. A comparison with the Total Ozone Mapping Spectrometer measurements of total column ozone shows that without the dilution, ozone over Southern Hemisphere midlatitudes would be 5–6% higher during spring and summer. This result is probably an overestimate due to the neglect of photochemical recovery.
Abstract:
If an export subsidy is efficient, that is, has a surplus-transfer role, then there exists an implicit function relating the optimal level of the subsidy to the income target in the agricultural sector. If an export subsidy is inefficient, no such function exists. We show that this dependence exists in large-export equilibrium but not in small-export equilibrium, and show that these results remain robust to concerns about domestic tax distortions. The failure of previous work to produce this result stems from its neglect of the income constraint on producer surplus in the programming problem transferring surplus from consumers and taxpayers to farmers.
Abstract:
Two sources of bias arise in conventional loss predictions in the wake of natural disasters. One source of bias stems from the failure to account for animal genetic resource loss. A second stems from the failure to identify, in addition to the direct effects of such loss, the indirect effects arising from its implications for animal-human interactions. We argue that, in some contexts, the magnitude of the bias introduced by neglecting animal genetic resource stocks is substantial. We show, in addition, and contrary to popular belief, that the biases attributable to losses in distinct genetic resource stocks are very likely to be the same. We derive the formal equivalence across the distinct resource stocks by deriving an envelope result in a model that forms the mainstay of enquiry in subsistence farming, and we validate the theory empirically in a World Society for the Protection of Animals application.
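The envelope logic invoked here, in generic form (the abstract does not give the paper's model):

\[
V(\theta) \;=\; \max_{x}\, f(x,\theta)
\quad\Longrightarrow\quad
\frac{dV}{d\theta} \;=\; \left.\frac{\partial f}{\partial \theta}\right|_{x = x^{*}(\theta)},
\]

so the marginal value of a resource stock is evaluated holding the household's optimal choices fixed, which is presumably why the biases attributable to losses in distinct genetic resource stocks coincide.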