851 results for Large-scale enterprises


Relevance:

90.00%

Publisher:

Abstract:

The purpose of this study was to extend understanding of how large firms pursuing sustained and profitable growth manage organisational renewal. A multiple-case study was conducted in 27 North American and European wood-industry companies, of which 11 were chosen for closer study. The study combined the organisational-capabilities approach to strategic management with corporate-entrepreneurship thinking. It charted the further development of an identification and classification system for capabilities comprising three dimensions: (i) the dynamism between firm-specific and industry-significant capabilities, (ii) hierarchies of capabilities and capability portfolios, and (iii) their internal structure. Capability building was analysed in the context of the organisational design, the technological systems and the type of resource-bundling process (creating new vs. entrenching existing capabilities). The thesis describes the current capability portfolios and the organisational changes in the case companies. It also clarifies the mechanisms through which companies can influence the balance between knowledge search and the efficiency of knowledge transfer and integration in their daily business activities, and consequently the diversity of their capability portfolio and the breadth and novelty of their product/service range. The largest wood-industry companies of today must develop a seemingly dual strategic focus: they have to combine leading-edge, innovative solutions with cost-efficient, large-scale production. The use of modern technology in production was no longer a primary source of competitiveness in the case companies, but rather belonged to the portfolio of basic capabilities. Knowledge and information management had become an industry imperative, on a par with cost effectiveness. Yet, during the period of this research, the case companies were better at supporting growth in the volume of existing activities than growth through new economic activities. Customer-driven, incremental innovation was preferred over firm-driven innovation through experimentation. The three main constraints on organisational renewal were the lack of slack resources, the aim for lean, centralised designs, and an inward-bound communication climate.

Relevance:

90.00%

Publisher:

Abstract:

The development and changing distribution of herbivorous mammal communities during the Neogene is complex. Eurasian-scale environmental patterns reflect large-scale geographical and climatic patterns, and their reorganization affects the distribution of biomes throughout the continent. The distribution of mammal taxa was closely associated with the distribution of biomes. In Eurasia the Neogene development of environments was twofold: the early and middle Miocene, which seems to have been advantageous for mammals, was followed by a drying of environments during the late Neogene. Mid-latitude drying was the main trend, the combined result of the retreat of the Paratethys, the uplift of the Tibetan Plateau, and changes in ocean currents and temperatures. The common mammals were "driving" the evolution of mammalian communities. During the late Miocene the drying affected more and more regions, and the composition of mammalian communities changed accordingly.

Relevance:

90.00%

Publisher:

Abstract:

Reproductive efficiency is an important determinant of profitable cattle breeding systems and of the success of assisted reproductive techniques (ART) in wildlife conservation programs. Methods of estrous detection used in intensive beef and dairy cattle systems lack accuracy and remain the single biggest obstacle to improving reproductive rates; moreover, such methods are not practical for either large-scale extensive beef cattle enterprises or free-living mammalian species. Recent developments in UHF (ultra-high frequency) proximity logger telemetry devices have been used to provide a continuous pair-wise measure of associations between individual animals for both livestock and wildlife. The objective of this study was to explore the potential of using UHF telemetry to identify the reproductive cycle phenotype in terms of the intensity and duration of estrus. The study was conducted using Belmont Red (interbred Africander Brahman Hereford–Shorthorn) cattle grazing irrigated pasture on Belmont Research Station, northeastern Australia. The cow–bull associations in three groups of cows, each with one bull, were recorded over a 7-week breeding season, and the stage of estrus was identified using ultrasonography. Telemetry data from the bull and cows, collected over four 8-day logger deployments, were log-transformed and analyzed by ANOVA. Both the number and duration of bull–cow affiliations were significantly (P < 0.001) greater in estrous cows than in anestrous cows. These results support the development of UHF technology as a hands-off and noninvasive means of gathering socio-sexual information on both wildlife and livestock for reproductive management.
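As a minimal illustration of the statistical step described above (log transformation of association metrics followed by ANOVA comparing estrous and anestrous cows), a sketch is given below; the counts and group sizes are purely hypothetical stand-ins, not the study's data.

```python
# Minimal sketch: log-transform bull-cow association counts and run a
# one-way ANOVA comparing estrous vs. anestrous cows.
# The data values and group sizes are illustrative assumptions only.
import numpy as np
from scipy import stats

# Hypothetical association counts per cow over one logger deployment
estrous_counts = np.array([42, 55, 61, 38, 47, 59])
anestrous_counts = np.array([12, 9, 17, 14, 11, 20])

# Log transformation (log(x + 1) guards against zero counts)
log_estrous = np.log(estrous_counts + 1)
log_anestrous = np.log(anestrous_counts + 1)

# One-way ANOVA on the transformed counts
f_stat, p_value = stats.f_oneway(log_estrous, log_anestrous)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```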

Relevance:

90.00%

Publisher:

Abstract:

Metabolism is the cellular subsystem responsible for generation of energy from nutrients and production of building blocks for larger macromolecules. Computational and statistical modeling of metabolism is vital to many disciplines including bioengineering, the study of diseases, drug target identification, and understanding the evolution of metabolism. In this thesis, we propose efficient computational methods for metabolic modeling. The techniques presented are targeted particularly at the analysis of large metabolic models encompassing the whole metabolism of one or several organisms. We concentrate on three major themes of metabolic modeling: metabolic pathway analysis, metabolic reconstruction and the study of the evolution of metabolism. In the first part of this thesis, we study metabolic pathway analysis. We propose a novel modeling framework called gapless modeling to study biochemically viable metabolic networks and pathways. In addition, we investigate the utilization of atom-level information on metabolism to improve the quality of pathway analyses. We describe efficient algorithms for discovering both gapless and atom-level metabolic pathways, and conduct experiments with large-scale metabolic networks. The presented gapless approach offers a compromise in terms of complexity and feasibility between the previous graph-theoretic and stoichiometric approaches to metabolic modeling. Gapless pathway analysis shows that microbial metabolic networks are not as robust to random damage as suggested by previous studies. Furthermore, the amino acid biosynthesis pathways of the fungal species Trichoderma reesei discovered from atom-level data are shown to closely correspond to those of Saccharomyces cerevisiae. In the second part, we propose computational methods for metabolic reconstruction in the gapless modeling framework. We study the task of reconstructing a metabolic network that does not suffer from connectivity problems. Such problems often limit the usability of reconstructed models, and typically require a significant amount of manual postprocessing. We formulate gapless metabolic reconstruction as an optimization problem and propose an efficient divide-and-conquer strategy to solve it for real-world instances. We also describe computational techniques for solving problems stemming from ambiguities in metabolite naming. These techniques have been implemented in ReMatch, a web-based software tool intended for the reconstruction of models for 13C metabolic flux analysis. In the third part, we extend our scope from single to multiple metabolic networks and propose an algorithm for inferring gapless metabolic networks of ancestral species from phylogenetic data. Experimenting with 16 fungal species, we show that the method is able to generate results that are easily interpretable and that provide hypotheses about the evolution of metabolism.
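To make the notion of a gapless (connectivity-problem-free) network concrete, the toy sketch below checks which metabolites are producible from a seed set by iteratively firing reactions whose substrates are all available; the example network, the seed set and the function are illustrative assumptions, not the algorithms developed in the thesis.

```python
# Minimal sketch of a producibility check on a toy metabolic network.
# A reaction can fire once all of its substrates are available; metabolites
# that can never be produced indicate "gaps" in the reconstruction.
# The example network and seed set are illustrative assumptions.

def producible_metabolites(reactions, seeds):
    """reactions: list of (substrates, products) pairs; seeds: iterable of metabolites."""
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if set(substrates) <= available and not set(products) <= available:
                available |= set(products)
                changed = True
    return available

toy_network = [
    (["glucose"], ["g6p"]),
    (["g6p"], ["f6p"]),
    (["f6p", "atp"], ["fbp"]),   # cannot fire: 'atp' is never produced -> a gap
]
print(producible_metabolites(toy_network, seeds={"glucose"}))
# -> {'glucose', 'g6p', 'f6p'}; 'fbp' is unreachable, flagging a connectivity gap
```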

Relevance:

90.00%

Publisher:

Abstract:

The increase in global temperature has been attributed to increased atmospheric concentrations of greenhouse gases (GHG), mainly that of CO2. The threat of severe and complex socio-economic and ecological implications of climate change has initiated an international process that aims to reduce emissions, to increase C sinks, and to protect existing C reservoirs. The famous Kyoto Protocol is an offspring of this process. The Kyoto Protocol and its accords state that signatory countries need to monitor their forest C pools and to follow the guidelines set by the IPCC in the preparation, reporting and quality assessment of the C pool change estimates. The aims of this thesis were i) to estimate the changes in the carbon stocks of vegetation and soil in Finnish forests from 1922 to 2004, ii) to evaluate the applied methodology by using empirical data, iii) to assess the reliability of the estimates by means of uncertainty analysis, iv) to assess the effect of forest C sinks on the reliability of the entire national GHG inventory, and finally, v) to present an application of model-based stratification to a large-scale sampling design of soil C stock changes. The applied methodology builds on forest inventory data (measured or modelled stand data), and uses statistical modelling to predict biomasses and litter production, as well as a dynamic soil C model to predict the decomposition of litter. The mean vegetation C sink of Finnish forests from 1922 to 2004 was 3.3 Tg C a-1, and the mean soil C sink was 0.7 Tg C a-1. Soil is slowly accumulating C as a consequence of the increased growing stock and of soil C stocks that are unsaturated relative to the current detritus input to soil, which is higher than at the beginning of the period. Annual estimates of vegetation and soil C stock changes fluctuated considerably during the period and were frequently opposite in sign (e.g. vegetation was a sink while soil was a source). The inclusion of vegetation sinks in the national GHG inventory of 2003 increased its uncertainty from between -4% and 9% to ± 19% (95% CI), and the further inclusion of upland mineral soils increased it to ± 24%. The uncertainties of annual sinks can be reduced most efficiently by concentrating on the quality of the model input data. Despite the decreased precision of the national GHG inventory, the inclusion of uncertain sinks improves its accuracy because of the larger sectoral coverage of the inventory. If the national soil sink estimates were prepared by repeated soil sampling of model-stratified sample plots, the uncertainties would be accounted for in the stratum formation and sample allocation. Otherwise, the gains in sampling efficiency from stratification remain smaller. The highly variable and frequently opposite annual changes in ecosystem C pools imply the importance of full ecosystem C accounting. If forest C sink estimates are to be used in practice, average sink estimates seem a more reasonable basis than annual estimates, because annual forest sinks vary considerably, annual estimates are uncertain, and these uncertainties have severe consequences for the reliability of the total national GHG balance. The estimation of average sinks should still be based on annual or even more frequent data because of the non-linear decomposition process, which is influenced by the annual climate. The methodology used in this study to predict forest C sinks can be transferred to other countries with some modifications.
The ultimate verification of sink estimates should be based on comparison to empirical data, in which case the model-based stratification presented in this study can serve to improve the efficiency of the sampling design.
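The effect described here, where adding an uncertain sink widens the confidence interval of the inventory total while broadening its coverage, can be illustrated with a simple Monte Carlo propagation; all distributions and numbers below are illustrative placeholders, not the study's inventory data.

```python
# Minimal sketch: Monte Carlo propagation of uncertainty when an uncertain
# forest C sink is added to a national GHG inventory total.
# All numbers below are illustrative placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical emissions total with a modest relative uncertainty
emissions = rng.normal(loc=70.0, scale=2.0, size=n)
# Hypothetical forest sink (negative = removal) with a much larger uncertainty
forest_sink = rng.normal(loc=-12.0, scale=6.0, size=n)

net_without_sink = emissions
net_with_sink = emissions + forest_sink

for label, sample in [("without sink", net_without_sink), ("with sink", net_with_sink)]:
    lo, hi = np.percentile(sample, [2.5, 97.5])
    mean = sample.mean()
    print(f"{label}: mean {mean:.1f}, 95% CI [{lo:.1f}, {hi:.1f}] "
          f"(± {(hi - lo) / 2 / abs(mean) * 100:.0f}% of the mean)")
```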

Relevance:

90.00%

Publisher:

Abstract:

Scales provide optical disguise, low water drag and mechanical protection to fish, enabling them to survive catastrophic environmental disasters, predators and microorganisms. The unique structures and stacking sequences of fish scales inspired the fabrication of artificial nanostructures with salient optical, interfacial and mechanical properties. Herein, we describe fish-scale bio-inspired multifunctional ZnO nanostructures that have a similar morphology and structure to the cycloid scales of the Asian Arowana. These nanostructured coatings feature tunable light refraction and reflection, modulated surface wettability and damage-tolerant mechanical properties. The salient properties of these multifunctional nanostructures are promising for applications in (i) optical coatings, sensing or lens arrays for use in reflective displays, packaging, advertising and solar energy harvesting; (ii) self-cleaning surfaces, including anti-smudge, anti-fouling, anti-fogging and self-sterilizing surfaces; and (iii) mechanical/chemical barrier coatings. This study provides a low-cost, large-scale and facile fabrication method for these bio-inspired nanostructures and offers new insights for the development of novel functional materials for use in 'smart' structures and applications.

Relevance:

90.00%

Publisher:

Abstract:

The use of social networking has exploded, with millions of people using various web- and mobile-based services around the world. This increase in social networking use has led to user anxiety related to privacy and the unauthorised exposure of personal information. Large-scale sharing in virtual spaces means that researchers, designers and developers now need to reconsider the issues and challenges of maintaining privacy when using social networking services. This paper provides a comprehensive survey of the current state of the art in privacy for social networks, covering both desktop and mobile uses and devices, from various architectural vantage points. The survey will assist researchers and analysts in academia and industry in moving towards mitigating many of the privacy issues in social networks.

Relevance:

90.00%

Publisher:

Abstract:

The ultrafast vibrational phase relaxation of the O–H stretch in bulk water is investigated using molecular dynamics simulations. The dephasing time (T2) of the O–H stretch in bulk water, calculated from the frequency fluctuation time correlation function (Cω(t)), is in the range of 70–80 femtoseconds (fs), which is comparable to the characteristic timescale obtained from vibrational echo peak shift measurements using infrared photon echo [W.P. de Boeij, M.S. Pshenichnikov, D.A. Wiersma, Ann. Rev. Phys. Chem. 49 (1998) 99]. The ultrafast decay of Cω(t) is found to be responsible for the ultrashort T2 in bulk water. Careful analysis reveals the following two interesting reasons for the ultrafast decay of Cω(t). (A) The large-amplitude angular jumps of water molecules (within a 30–40 fs time duration) provide a large-scale contribution to the mean square vibrational frequency fluctuation and give rise to rapid spectral diffusion on a 100 fs time scale. (B) The projected forces, due to all the atoms of the solvent molecules, on the oxygen (FO(t)) and hydrogen (FH(t)) atoms of the O–H bond exhibit a large negative cross-correlation (NCC). We further find that this NCC is partly responsible for a weak, non-Arrhenius temperature dependence of the dephasing rate.
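A minimal sketch of how a frequency fluctuation time correlation function of this kind can be evaluated from a frequency trajectory is given below; the trajectory is a synthetic stand-in series (an Ornstein-Uhlenbeck-like process), not simulation output, and the sampling interval and relaxation time are assumptions.

```python
# Minimal sketch: frequency fluctuation time correlation function
#   C_w(t) = <dw(0) dw(t)> / <dw^2>,  dw(t) = w(t) - <w>
# computed from a synthetic O-H stretch frequency trajectory (not MD data).
import numpy as np

rng = np.random.default_rng(1)
dt_fs = 1.0                       # sampling interval in fs (assumption)
n_steps = 20_000
tau_fs, sigma = 60.0, 1.0         # illustrative relaxation time and amplitude

# Synthetic frequency trajectory: an Ornstein-Uhlenbeck-like process
omega = np.empty(n_steps)
omega[0] = 0.0
for i in range(1, n_steps):
    omega[i] = (omega[i-1] * (1 - dt_fs / tau_fs)
                + sigma * np.sqrt(2 * dt_fs / tau_fs) * rng.normal())

d_omega = omega - omega.mean()

def c_omega(x, max_lag):
    """Normalized time correlation function of a fluctuation time series."""
    var = np.mean(x * x)
    return np.array([np.mean(x[:len(x)-k] * x[k:]) for k in range(max_lag)]) / var

corr = c_omega(d_omega, max_lag=300)
print("C_w at t = 0, 50, 100 fs:", corr[0], corr[50], corr[100])
```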

Relevance:

90.00%

Publisher:

Abstract:

The rapidly burgeoning popularity of cinema at the beginning of the 20th century favored industrialized modes of creativity organized around large production studios that could churn out a steady stream of narrative feature films. By the mid-1910s, a handful of Hollywood studios became leaders in the production, distribution, and exhibition of popular commercial movies. In order to serve incessant demand for new titles, the studios relied on a set of conventions that allowed them to regularize production and realize workplace efficiencies. This entailed a socialized mode of creativity that would later be adopted by radio and television broadcasters. It would also become a model for cinema and media production around the world, for both commercial and state-supported institutions. Even today the core tenets of industrialized creativity prevail in most large media enterprises. During the 1980s and 1990s, however, media industries began to change radically, driven by forces of neoliberalism, corporate conglomeration, globalization, and technological innovation. Today, screen media are created both by large-scale production units and by networked ensembles of talent and skilled labor. Moreover, digital media production may take place in small shops or via the collective labor of media users or fans, who have attracted attention due to their hyphenated status as both producers and users of media (i.e., "prosumers"). Studies of screen media labor fall into several conceptual and methodological categories: historical studies of labor relations, ethnographically inspired investigations of workplace dynamics, critical analyses of the spatial and social organization of labor, and normative assessments of industrialized creativity.

Relevance:

90.00%

Publisher:

Abstract:

The paper describes the sensitivity of simulated precipitation to changes in the convective relaxation time scale (TAU) of the Zhang and McFarlane (ZM) cumulus parameterization in the NCAR Community Atmosphere Model version 3 (CAM3). In the default configuration of the model, the prescribed value of TAU, a characteristic time scale with which convective available potential energy (CAPE) is removed at an exponential rate by convection, is assumed to be 1 h. However, some recent observational findings suggest that it is larger by around one order of magnitude. In order to explore the sensitivity of the model simulation to TAU, two model frameworks have been used, namely aqua-planet and actual-planet configurations. Numerical integrations have been carried out using different values of TAU, and its effect on simulated precipitation has been analyzed. The aqua-planet simulations reveal that when TAU increases, the rate of deep convective precipitation (DCP) decreases, and this leads to an accumulation of convective instability in the atmosphere. Consequently, the moisture content in the lower and mid-troposphere increases. On the other hand, shallow convective precipitation (SCP) and large-scale precipitation (LSP) intensify, predominantly the SCP, thus capping the accumulation of convective instability in the atmosphere. The total precipitation (TP) remains approximately constant, but the proportion of the three components changes significantly, which in turn alters the vertical distribution of total precipitation production. The vertical structure of moist heating changes from a vertically extended profile to a bottom-heavy profile as TAU increases. The altitude of the maximum vertical velocity shifts from the upper troposphere to the lower troposphere. A similar response was seen in the actual-planet simulations. With an increase in TAU from 1 h to 8 h, there was a significant improvement in the simulation of the seasonal mean precipitation. The fraction of deep convective precipitation was in much better agreement with satellite observations.
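The role of TAU in this type of closure can be illustrated with a toy relaxation equation in which convection consumes CAPE at the rate CAPE/TAU; the numbers, the CAPE source term and the forward-Euler time stepping below are illustrative assumptions, not the CAM3/ZM code.

```python
# Minimal sketch of a CAPE-relaxation closure: convection removes CAPE at a
# rate CAPE/TAU, so for a given CAPE the convective intensity (and hence the
# deep convective precipitation) scales inversely with TAU.
# Values are illustrative; this is not the actual CAM3/ZM implementation.
import numpy as np

def relax_cape(cape0_j_per_kg, tau_s, dt_s, n_steps, source_j_per_kg_s=0.1):
    """Integrate dCAPE/dt = source - CAPE/tau with forward Euler."""
    cape = cape0_j_per_kg
    history = []
    for _ in range(n_steps):
        consumption = cape / tau_s          # closure: removal rate ~ CAPE/TAU
        cape += (source_j_per_kg_s - consumption) * dt_s
        history.append(consumption)
    return np.array(history)

for tau_hours in (1, 8):
    rates = relax_cape(cape0_j_per_kg=1000.0, tau_s=tau_hours * 3600,
                       dt_s=600.0, n_steps=144)   # 24 h with 10-min steps
    print(f"TAU = {tau_hours} h: mean CAPE consumption "
          f"{rates.mean():.3f} J kg^-1 s^-1")
```

With the larger TAU, CAPE is consumed more slowly and accumulates, which is the behaviour the aqua-planet experiments describe.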

Relevance:

90.00%

Publisher:

Abstract:

Large eddy simulation (LES) is an emerging technique for obtaining an approximation to turbulent flow fields. It is an improvement over the widely prevalent practice of obtaining means of turbulent flows when the flow has large-scale, low-frequency unsteadiness. An introduction to the method, its general formulation, and the more common modelling for flows without reaction is discussed. Some attempts at extension to flows with combustion have been made. Examples from present work for flows with and without combustion are given. The final example, the LES of the combustor of a helicopter engine, illustrates the state of the art in the application of the technique.
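As a pointer to what the "more common modelling" referred to above typically involves, the following minimal sketch evaluates a Smagorinsky eddy viscosity on a synthetic resolved-velocity field; the grid, constant and field are illustrative assumptions, not taken from the work described here.

```python
# Minimal sketch of the Smagorinsky subgrid-scale model used in many LES codes:
#   nu_t = (C_s * Delta)^2 * |S|,  |S| = sqrt(2 S_ij S_ij)
# computed on a small synthetic 3-D velocity field. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n, dx = 32, 1.0 / 32          # grid size and spacing (assumption)
c_s = 0.17                    # Smagorinsky constant (typical value)

u = [rng.standard_normal((n, n, n)) for _ in range(3)]  # stand-in velocity field

# Velocity gradient tensor du_i/dx_j via finite differences
grad = [[np.gradient(u[i], dx, axis=j) for j in range(3)] for i in range(3)]

# Strain-rate tensor S_ij = 0.5 (du_i/dx_j + du_j/dx_i) and its magnitude
s_mag_sq = np.zeros((n, n, n))
for i in range(3):
    for j in range(3):
        s_ij = 0.5 * (grad[i][j] + grad[j][i])
        s_mag_sq += 2.0 * s_ij * s_ij

nu_t = (c_s * dx) ** 2 * np.sqrt(s_mag_sq)   # eddy viscosity field
print("mean eddy viscosity:", nu_t.mean())
```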

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a novel Second Order Cone Programming (SOCP) formulation for large-scale binary classification tasks. Assuming that the class conditional densities are mixture distributions, where each component of the mixture has a spherical covariance, the second-order statistics of the components can be estimated efficiently using clustering algorithms like BIRCH. For each cluster, the second-order moments are used to derive a second order cone constraint via a Chebyshev-Cantelli inequality. This constraint ensures that any data point in the cluster is classified correctly with high probability. This leads to a large-margin SOCP formulation whose size depends on the number of clusters rather than the number of training data points. Hence, the proposed formulation scales well for large datasets compared to state-of-the-art classifiers such as Support Vector Machines (SVMs). Experiments on real-world and synthetic datasets show that the proposed algorithm outperforms SVM solvers in terms of training time and achieves similar accuracies.
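A rough sketch of this pipeline, under several simplifying assumptions (synthetic blobs, a spherical per-cluster standard deviation, a hard margin, and cvxpy as the conic front-end; none of this is the paper's actual code), is shown below.

```python
# Rough sketch: BIRCH clustering + per-cluster second-order moments + one
# second-order cone (chance) constraint per cluster, solved with cvxpy.
# Data, eta, the spherical-std estimate and the hard margin are assumptions.
import numpy as np
import cvxpy as cp
from sklearn.cluster import Birch
from sklearn.datasets import make_blobs

X, y = make_blobs(n_samples=2000, centers=[[-5, -5], [5, 5]], random_state=0)
y = 2 * y - 1                                   # labels in {-1, +1}

eta = 0.9                                       # required per-cluster probability
kappa = np.sqrt(eta / (1.0 - eta))              # Chebyshev-Cantelli factor

clusters = []                                   # (label, mean, spherical std)
for label in (-1, +1):
    Xc = X[y == label]
    birch = Birch(n_clusters=10).fit(Xc)
    for k in np.unique(birch.labels_):
        pts = Xc[birch.labels_ == k]
        clusters.append((label, pts.mean(axis=0), pts.std()))

# Large-margin SOCP: one cone constraint per cluster instead of per data point
w, b = cp.Variable(2), cp.Variable()
constraints = [lab * (mu @ w + b) >= 1 + kappa * sigma * cp.norm(w, 2)
               for lab, mu, sigma in clusters]
cp.Problem(cp.Minimize(cp.norm(w, 2)), constraints).solve()
print("w =", w.value, "b =", b.value)
```

The number of cone constraints equals the number of clusters rather than the number of training points, which is the source of the claimed scalability.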

Relevance:

90.00%

Publisher:

Abstract:

Following the seminal work of Charney and Shukla (1981), the tropical climate is recognised to be more predictable than the extratropical climate, as it is largely forced by 'external' slowly varying forcing and is less sensitive to initial conditions. However, the Indian summer monsoon is an exception within the tropics where 'internal' low frequency (LF) oscillations seem to make a significant contribution to its interannual variability (IAV) and make it sensitive to initial conditions. A quantitative estimate of the contribution of 'internal' dynamics to the IAV of the Indian monsoon is made using long experiments with an atmospheric general circulation model (AGCM) and through analysis of long daily observations. Both AGCM experiments and observations indicate that more than 50% of the IAV of the monsoon is contributed by 'internal' dynamics, leaving the predictable signal (external component) buried in unpredictable noise (internal component) of comparable amplitude. A better understanding of the nature of the 'internal' LF variability is crucial for any improvement in prediction of the seasonal mean monsoon. The nature of the 'internal' LF variability of the monsoon and the mechanism responsible for it are investigated, and it is shown that vigorous monsoon intraseasonal oscillations (ISOs) with time scales between 10 and 70 days are primarily responsible for generating the 'internal' IAV. The monsoon ISOs do this through scale interactions with synoptic disturbances (1-7 day time scale) on one hand and the annual cycle on the other. The spatial structure of the monsoon ISOs is similar to that of the seasonal mean. It is shown that the frequency of occurrence of strong (weak) phases of the ISO differs from season to season, giving rise to a stronger (weaker) than normal monsoon. Changes in the large-scale circulation during strong (weak) phases of the ISO favour (inhibit) cyclogenesis and give rise to space-time clustering of synoptic activity. This process leads to enhanced (reduced) rainfall in seasons with a higher frequency of occurrence of strong (weak) phases of the monsoon ISO.
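One standard way to quantify such an 'internal' contribution from an ensemble of AGCM runs is to compare the variance of the ensemble mean across years (forced, 'external' part) with the average spread among members within each year ('internal' part); the sketch below uses a simple additive forced-plus-noise model with synthetic numbers, not the study's simulations.

```python
# Minimal sketch of separating 'external' (forced) and 'internal' variance in
# an ensemble of AGCM seasonal-mean simulations.
# The synthetic numbers below are stand-ins, not the study's model output.
import numpy as np

rng = np.random.default_rng(3)
n_years, n_members = 40, 10

forced_signal = rng.normal(0.0, 1.0, size=n_years)            # SST-forced component
internal_noise = rng.normal(0.0, 1.1, size=(n_years, n_members))
monsoon_rain = forced_signal[:, None] + internal_noise        # member-wise seasonal means

# Variance of the ensemble mean across years (still contains a small internal
# contribution of order internal/n_members) vs. mean within-year spread.
external_var = monsoon_rain.mean(axis=1).var(ddof=1)
internal_var = monsoon_rain.var(axis=1, ddof=1).mean()
total = external_var + internal_var

print(f"internal fraction of IAV ~ {internal_var / total:.0%}")
```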

Relevance:

90.00%

Publisher:

Abstract:

In this paper we present a massively parallel open-source solver for Richards' equation, named RichardsFOAM. This solver has been developed within the framework of the open-source, general-purpose computational fluid dynamics toolbox OpenFOAM® and is capable of dealing with large-scale problems in both space and time. The source code for RichardsFOAM may be downloaded from the CPC program library website. It exhibits good parallel performance (up to ~90% parallel efficiency with 1024 processors in both strong and weak scaling), and the conditions required for obtaining such performance are analysed and discussed. This performance enables the mechanistic modelling of water fluxes at the scale of experimental watersheds (up to a few square kilometres of surface area) and on time scales of decades to a century. Such a solver can be useful in various applications, such as environmental engineering for the long-term transport of pollutants in soils, water engineering for assessing the impact of land settlement on water resources, or the study of weathering processes in watersheds.
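Richards' equation itself can be illustrated with a toy one-dimensional explicit scheme, assuming the exponential Gardner conductivity and retention relations; this sketch only shows the equation being solved and is unrelated to RichardsFOAM's implicit, parallel finite-volume implementation.

```python
# Minimal sketch of 1-D Richards' equation (mixed form, explicit in time,
# z positive downward) with Gardner relations K(h) = Ks*exp(a*h) and
# theta(h) = tr + (ts - tr)*exp(a*h). Illustrative toy only.
import numpy as np

Ks, a = 1e-5, 2.0            # saturated conductivity [m/s], Gardner parameter [1/m]
ts, tr = 0.40, 0.05          # saturated and residual water contents
nz, dz, dt = 50, 0.02, 1.0   # grid cells, spacing [m], time step [s]

h = np.full(nz, -2.0)        # initially dry column (pressure head in m)
h[0] = -0.05                 # nearly saturated top boundary (infiltration)

def K(hh):     return Ks * np.exp(a * hh)
def theta(hh): return tr + (ts - tr) * np.exp(a * hh)
def head(t):   return np.log((t - tr) / (ts - tr)) / a   # invert retention curve

for _ in range(20000):                             # roughly 5.5 hours of infiltration
    Kf = 0.5 * (K(h[:-1]) + K(h[1:]))              # conductivity at cell faces
    q = -Kf * ((h[1:] - h[:-1]) / dz - 1.0)        # Darcy flux, positive downward
    t_new = theta(h)
    t_new[1:-1] += dt * (q[:-1] - q[1:]) / dz      # interior mass balance
    h[1:-1] = head(t_new[1:-1])                    # top and bottom heads held fixed

print("water content, top 10 cells:", np.round(theta(h)[:10], 3))
```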