939 results for Work organization models
Abstract:
A major infrastructure project is used to investigate the role of digital objects in the coordination of engineering design work. From a practice-based perspective, research emphasizes objects as important in enabling cooperative knowledge work and knowledge sharing. The term ‘boundary object’ has come to be used in the analysis of mutual and reciprocal knowledge sharing around physical and digital objects. The aim is to extend this work by analysing the introduction of an extranet into the public–private partnership project used to construct a new motorway. Multiple categories of digital objects are mobilized in coordination across heterogeneous, cross-organizational groups. The main findings are that digital objects provide mechanisms for accountability and control, as well as for mutual and reciprocal knowledge sharing; and that different types of objects are nested, forming a digital infrastructure for project delivery. Reconceptualizing boundary objects as a digital infrastructure for delivery has practical implications for management practices on large projects and for the use of digital tools, such as building information models, in construction. It provides a starting point for future research into the changing nature of digitally enabled coordination in project-based work.
Abstract:
Identity issues are under-explored in construction management. We provide a brief introduction to the organization studies literature on subjectively construed identities, focusing on discourse, agency, relations of power and identity work. The construction management literature is investigated in order to examine identity concerns as they relate to construction managers, centred on: (1) professionalism; (2) ethics; (3) relational aspects of self-identity; (4) competence, knowledge and tools; and (5) national culture. Identity, we argue, is a key performance issue, and needs to be accounted for in explanations of the success and failure of projects. Our overriding concern is to raise identity issues in order to demonstrate their importance to researchers in construction management and to spark debate. The purpose of this work is not to provide answers or to propose prescriptive models, but to explore ideas, raise awareness and to generate questions for further programmatic research. To this end, we promote empirical work and theorizing by outlining elements of a research agenda which argues that 'identity' is a potentially generative theme for scholars in construction management.
Abstract:
As integrated software solutions reshape project delivery, they alter the bases for collaboration and competition across firms in complex industries. This paper synthesises and extends literatures on strategy in project-based industries and digitally integrated work to understand how project-based firms interact with digital infrastructures for project delivery. Four identified strategies are to: 1) develop and use capabilities to shape the integrated software solutions that are used in projects; 2) co-specialize, developing complementary assets to work repeatedly with a particular integrator firm; 3) retain flexibility by developing and maintaining capabilities in multiple digital technologies and processes; and 4) manage interfaces, translating work into project formats for coordination while hiding proprietary data and capabilities in internal systems. The paper articulates the strategic importance of digital infrastructures for delivery as well as product architectures. It concludes by discussing managerial implications of the identified strategies and areas for further research.
Abstract:
This paper presents recent developments to a vision-based traffic surveillance system which relies extensively on the use of geometrical and scene context. Firstly, a highly parametrised 3-D model is reported, able to adopt the shape of a wide variety of different classes of vehicle (e.g. cars, vans, buses etc.), and its subsequent specialisation to a generic car class which accounts for commonly encountered types of car (including saloon, hatchback and estate cars). Sample data collected from video images, by means of an interactive tool, have been subjected to principal component analysis (PCA) to define a deformable model having 6 degrees of freedom. Secondly, a new pose refinement technique using “active” models is described, able to recover both the pose of a rigid object and the structure of a deformable model; its performance is assessed in comparison with previously reported “passive” model-based techniques in the context of traffic surveillance. The new method is more stable and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence. Typical applications for this work include robot surveillance and navigation tasks.
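The PCA step described in this abstract — deriving a low-dimensional deformable model from sampled shape vectors — can be sketched in a few lines. The following is a generic illustration, not the authors' implementation: it recovers the mean shape and the dominant deformation mode from equal-length shape vectors using power iteration on the sample covariance; the function name and synthetic data are hypothetical.

```python
import random

def principal_mode(shapes, iters=200):
    """Return (mean_shape, first_principal_direction) for a list of
    equal-length shape vectors, via power iteration on the covariance."""
    n, d = len(shapes), len(shapes[0])
    mean = [sum(s[j] for s in shapes) / n for j in range(d)]
    centred = [[s[j] - mean[j] for j in range(d)] for s in shapes]
    v = [random.random() for _ in range(d)]  # random start direction
    for _ in range(iters):
        # Apply the covariance implicitly: w = (1/n) * X^T (X v)
        proj = [sum(c[j] * v[j] for j in range(d)) for c in centred]
        w = [sum(proj[i] * centred[i][j] for i in range(n)) / n for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v
```

Repeating the deflation step on residuals would yield further modes; six such modes would correspond to the 6-degree-of-freedom model the abstract mentions.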
Abstract:
Recent laboratory observations and advances in theoretical quantum chemistry allow a reappraisal of the fundamental mechanisms that determine the water vapour self-continuum absorption throughout the infrared and millimetre wave spectral regions. By starting from a framework that partitions bimolecular interactions between water molecules into free-pair states, true bound and quasi-bound dimers, we present a critical review of recent observations, continuum models and theoretical predictions. In the near-infrared bands of the water monomer, we propose that spectral features in recent laboratory-derived self-continuum can be well explained as being due to a combination of true bound and quasi-bound dimers, when the spectrum of quasi-bound dimers is approximated as being double the broadened spectrum of the water monomer. Such a representation can explain both the wavenumber variation and the temperature dependence. Recent observations of the self-continuum absorption in the windows between these near-infrared bands indicate that widely used continuum models can underestimate the true strength by around an order of magnitude. An existing far-wing model does not appear able to explain the discrepancy, and although a dimer explanation is possible, currently available observations do not allow a compelling case to be made. In the 8–12 micron window, recent observations indicate that the modern continuum models do not properly represent the temperature dependence, the wavelength variation, or both. The temperature dependence is suggestive of a transition from the dominance of true bound dimers at lower temperatures to quasi-bound dimers at higher temperatures. In the mid- and far-infrared spectral region, recent theoretical calculations indicate that true bound dimers may explain at least between 20% and 40% of the observed self-continuum. The possibility that quasi-bound dimers could cause an additional contribution of the same size is discussed. Most recent theoretical considerations agree that water dimers are likely to be the dominant contributor to the self-continuum in the mm-wave spectral range.
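The approximation mentioned above — treating the quasi-bound dimer spectrum as double a broadened monomer spectrum — can be illustrated numerically. This is a toy sketch, not the authors' model: it uses a single synthetic Lorentzian line, and the extra-broadening factor is an arbitrary illustrative value.

```python
import math

def lorentz(nu, nu0, gamma):
    """Unit-area Lorentzian line shape centred at nu0 with half-width gamma."""
    return (gamma / math.pi) / ((nu - nu0) ** 2 + gamma ** 2)

def monomer_spectrum(grid, lines, gamma):
    """Broadened monomer spectrum on a wavenumber grid: sum over a line list
    of (position, intensity) pairs."""
    return [sum(s * lorentz(nu, nu0, gamma) for nu0, s in lines) for nu in grid]

def quasi_bound_estimate(grid, lines, gamma, extra=3.0):
    """Quasi-bound dimer spectrum approximated as twice the monomer spectrum,
    with additional broadening ('extra' is purely illustrative)."""
    return [2.0 * x for x in monomer_spectrum(grid, lines, gamma * extra)]
```

The doubled, broadened spectrum has twice the integrated intensity of the monomer band but a lower, wider peak, which is the qualitative behaviour the representation relies on.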
Abstract:
Organizational issues are inhibiting the implementation and strategic use of information technologies (IT) in the construction sector. This paper focuses on these issues and explores processes by which emerging technologies can be introduced into construction organizations. The paper is based on a case study, conducted in a major house building company that was implementing a virtual reality (VR) system for internal design review in the regional offices. Interviews were conducted with different members of the organization to explore the introduction process and the use of the system. The case study findings provide insight into the process of change, the constraints that inhibit IT implementation and the relationship between new technology and work patterns within construction organizations. They suggest that (1) user-developer communications are critical for the successful implementation of non-diffused innovations in the construction industry; and (2) successful uptake of IT requires both strategic decision-making by top management and decision-making by technical managers.
Abstract:
We introduce the notion that the energy of individuals can manifest as a higher-level, collective construct. To this end, we conducted four independent studies to investigate the viability and importance of the collective energy construct as assessed by a new survey instrument—the productive energy measure (PEM). Study 1 (n = 2208) included exploratory and confirmatory factor analyses to explore the underlying factor structure of PEM. Study 2 (n = 660) cross-validated the same factor structure in an independent sample. In study 3, we administered the PEM to more than 5000 employees from 145 departments located in five countries. Results from measurement invariance, statistical aggregation, convergent, and discriminant-validity assessments offered additional support for the construct validity of PEM. In terms of predictive and incremental validity, the PEM was positively associated with three collective attitudes—units' commitment to goals, the organization, and overall satisfaction. In study 4, we explored the relationship between the productive energy of firms and their overall performance. Using data from 92 firms (n = 5939 employees), we found a positive relationship between the PEM (aggregated to the firm level) and the performance of those firms. Copyright © 2011 John Wiley & Sons, Ltd.
Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target at an 80% reduction by 2050 relative to 1990 levels. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector: principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail rich to model technological changes. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals’ behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents are used to model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and to analyse the cost effectiveness of various policy measures.
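A minimal version of such an agent-based diffusion model can be sketched as follows. This is an illustration only, not the model the paper proposes: the adoption rule (a Bass-style mix of spontaneous adoption and imitation of current adopters) and all parameter values are assumptions made for the sketch.

```python
import random

def run_abm(n_agents=1000, p_innovate=0.01, q_imitate=0.1, steps=50, seed=42):
    """Minimal technology-diffusion ABM: at each step, a non-adopter adopts
    with probability p_innovate + q_imitate * (fraction of current adopters).
    Returns the adopter fraction after each step."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents          # social influence term
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p_innovate + q_imitate * frac:
                adopted[i] = True
        history.append(sum(adopted) / n_agents)
    return history
```

In a fuller model the rule set would be calibrated against the stated-preference and opinion data the paper calls for, and agents would be heterogeneous rather than identical.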
Abstract:
We present an intercomparison and verification analysis of 20 GCMs (Global Circulation Models) included in the 4th IPCC assessment report regarding their representation of the hydrological cycle on the Danube river basin for 1961–2000 and for the 2161–2200 SRES A1B scenario runs. The basin-scale properties of the hydrological cycle are computed by spatially integrating the precipitation, evaporation, and runoff fields using the Voronoi-Thiessen tessellation formalism. The span of the model-simulated mean annual water balances is of the same order of magnitude as the observed Danube discharge of the Delta; the true value is within the range simulated by the models. Some land components seem to have deficiencies since there are cases of violation of water conservation when annual means are considered. The overall performance and the degree of agreement of the GCMs are comparable to those of the RCMs (Regional Climate Models) analyzed in a previous work, in spite of the much higher resolution and common nesting of the RCMs. The reanalyses are shown to feature several inconsistencies and cannot be used as a verification benchmark for the hydrological cycle in the Danubian region. In the scenario runs, for basically all models the water balance decreases, whereas its interannual variability increases. Changes in the strength of the hydrological cycle are not consistent among models: it is confirmed that capturing the impact of climate change on the hydrological cycle is not an easy task over land areas. Moreover, in several cases we find that qualitatively different behaviors emerge among the models: the ensemble mean does not represent any sort of average model, and often it falls between the models’ clusters.
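The Voronoi-Thiessen spatial integration used here amounts to area-weighted averaging: each location is assigned to its nearest data point, and the resulting polygon areas become the weights. A generic discrete-grid sketch (not the authors' code; all coordinates and values are made up) looks like this:

```python
def thiessen_weights(grid_points, stations):
    """Assign each grid point to its nearest station (Thiessen polygons
    approximated on a discrete grid) and return per-station area fractions."""
    counts = [0] * len(stations)
    for gx, gy in grid_points:
        d2 = [(gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in stations]
        counts[d2.index(min(d2))] += 1   # nearest station wins the cell
    total = len(grid_points)
    return [c / total for c in counts]

def basin_mean(values, weights):
    """Area-weighted basin average of a field sampled at the stations."""
    return sum(v * w for v, w in zip(values, weights))
```

On a sphere the squared-distance metric would be replaced by great-circle distance and cell areas by latitude-dependent weights, but the weighted-sum structure is the same.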
Abstract:
This study examines criteria for the existence of two stable states of the Atlantic Meridional Overturning Circulation (AMOC) using a combination of theory and simulations from a numerical coupled atmosphere–ocean climate model. By formulating a simple collection of state parameters and their relationships, the authors reconstruct the North Atlantic Deep Water (NADW) OFF state behavior under a varying external salt-flux forcing. This part (Part I) of the paper examines the steady-state solution, which gives insight into the mechanisms that sustain the NADW OFF state in this coupled model; Part II deals with the transient behavior predicted by the evolution equation. The nonlinear behavior of the Antarctic Intermediate Water (AAIW) reverse cell is critical to the OFF state. Higher Atlantic salinity leads both to a reduced AAIW reverse cell and to a greater vertical salinity gradient in the South Atlantic. The former tends to reduce Atlantic salt export to the Southern Ocean, while the latter tends to increase it. These competing effects produce a nonlinear response of Atlantic salinity and salt export to salt forcing, and the existence of maxima in these quantities. Thus the authors obtain a natural and accurate analytical saddle-node condition for the maximal surface salt flux for which a NADW OFF state exists. By contrast, the bistability indicator proposed by De Vries and Weber does not generally work in this model. It is applicable only when the effect of the AAIW reverse cell on the Atlantic salt budget is weak.
Abstract:
The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
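Of the two published reference models mentioned, the variance ratio method has a particularly compact form: the slope is the ratio of the site and reference standard deviations and the intercept matches the means, so the predicted series reproduces the site's variance rather than shrinking it, as least-squares regression does when the correlation is below one. A minimal sketch (function names and data are illustrative):

```python
from statistics import mean, stdev

def variance_ratio_fit(site, ref):
    """Variance-ratio MCP fit on concurrent records: y = a + b*x with
    b = sigma_site / sigma_ref, a chosen so the means match."""
    b = stdev(site) / stdev(ref)
    a = mean(site) - b * mean(ref)
    return a, b

def predict(a, b, ref_long_term):
    """Long-term site prediction from the historic reference series."""
    return [a + b * x for x in ref_long_term]
```

By construction, applying the fit back to the concurrent reference data recovers the site mean and standard deviation exactly, which is the property that motivates the method.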
Abstract:
We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
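The "partitioning the randomness" question can be made concrete with a toy frailty simulation: give each individual a random frailty multiplying a baseline hazard, draw an exponential lifespan, and split the total lifespan variance with the law of total variance into a between-individual (frailty) term and a within-individual (chance) term. The model and all parameters below are illustrative assumptions, not any specific model from the literature reviewed.

```python
import math
import random
from statistics import mean, pvariance

def partition_lifespan_variance(n=20000, base_rate=0.05, frailty_sd=0.5, seed=1):
    """Simulate exponential lifespans whose rate is scaled by a lognormal
    individual frailty, then decompose lifespan variance via the law of
    total variance: Var(T) = Var(E[T|z]) + E[Var(T|z)]."""
    rng = random.Random(seed)
    cond_means, cond_vars, lifespans = [], [], []
    for _ in range(n):
        z = math.exp(rng.gauss(0.0, frailty_sd))   # individual frailty
        rate = base_rate * z
        cond_means.append(1.0 / rate)              # E[T | z]
        cond_vars.append(1.0 / rate ** 2)          # Var[T | z]
        lifespans.append(rng.expovariate(rate))
    between = pvariance(cond_means)   # variance attributable to individuals
    within = mean(cond_vars)          # pure chance within an individual
    return between, within, pvariance(lifespans)
```

The two returned components are exactly the kind of quantities the review's questions target: how much variation is a quality of the individual versus entirely random.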
Abstract:
This paper arises from a doctoral thesis comparing the impact of alternative installer business models on the rate at which microgeneration is taken up in homes and installation standards across the UK. The paper presents the results of the first large-scale academic survey of businesses certified to install residential microgeneration. The aim is to systematically capture those characteristics which define the business model of each surveyed company, and relate these to the number, location and type of technologies that they install, and the quality of these installations. The methodology comprised a pilot web survey of 235 certified installer businesses, which was carried out in June 2011 and achieved a response rate of 30%. Following optimisation of the design, the main web survey was emailed to over 2000 businesses between October and December 2011, with 317 valid responses received. The survey is being complemented during summer 2012 by semi-structured interviews with a representative sample of installers who completed the main survey. The survey results are currently being analysed. The early results indicate an emerging and volatile market where solar PV, solar hot water and air source heat pumps are the dominant technologies. Three quarters of respondents are founders of their installer business, while only 22 businesses are owned by another company. Over half of the 317 businesses have five employees or less, while 166 businesses are no more than four years old. In addition, half of the businesses stated that 100% of their employees work on microgeneration-related activities. 85% of the surveyed companies have only one business location in the UK. A third of the businesses are based either in the South West or South East regions of England. This paper outlines the interim results of the survey combined with the outcomes from additional interviews with installers to date. The research identifies some of the business models underpinning microgeneration installers and some of the ways in which installer business models impact on the rate and standards of microgeneration uptake. A tentative conclusion is that installer business models are profoundly dependent on the levels and timing of support from the UK Feed-in Tariffs and Renewable Heat Incentive.
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. 12 km explicit convection simulations also perform much better than the 12 km parameterized convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit convection simulation using the conventional boundary layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500–400 hPa.