12 results for resource-based view of the firm

in Helda - Digital Repository of the University of Helsinki


Relevance: 100.00%

Abstract:

Farms and rural areas have many specific, valuable resources that can be used to create non-agricultural products and services. Most of the research on on-farm diversification has hitherto concentrated on business start-up or farm survival strategies; resource allocation and financial success have not yet been the primary focus of investigation. This study examined precisely these topics, resource allocation and the financial success of diversified farms, from a farm management perspective. The key question addressed in this dissertation is how the tangible and intangible resources of the diversified farm affect its financial success. The theoretical background of the study draws on resource-based theory, as well as on certain themes of the theory of the learning organisation and other decision-making theories. Two datasets were utilised: data collected by postal survey in 2001 (n = 663) and data collected in a follow-up survey in 2006 (n = 439). The data were analysed using multivariate analyses and path analyses. The results reveal that diversified farms performed differently and that success and resources were linked. Professional and management skills affected other resources and hence influenced success either directly or indirectly. In the light of the empirical analyses, the tangible and intangible resources owned by the diversified farm had an impact on its financial success. The findings underline the importance of skills and networks for the entrepreneur(s). Practically all respondents used either agricultural resources for non-farm businesses or non-farm resources for agricultural enterprises; sharing resources in this way was seen by farmers as a pragmatic opportunity. One of the downsides of diversification may be over-diversification, defined as the situation in which a farm diversifies beyond its optimal limit. The empirical findings reveal that capital and labour resource constraints had adverse effects on financial success: farms that were capital and labour constrained in 2001 were still less profitable than their ‘no problems’ counterparts five years later.
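The path-analytic approach mentioned above can be illustrated with a minimal sketch in Python using the semopy package. The model specification (management skills affecting tangible and intangible resources, which in turn affect financial success), the variable names and the data file are illustrative assumptions, not the dissertation's actual model or data.

```python
# Illustrative sketch only: a simple path model in the spirit of the study,
# with hypothetical variable names (columns of a survey dataset).
import pandas as pd
from semopy import Model

# Hypothetical survey data: one row per diversified farm.
data = pd.read_csv("farm_survey_2001.csv")  # assumed file with the columns below

# Path model: management skills influence tangible/intangible resources,
# which in turn influence financial success (direct and indirect effects).
model_desc = """
tangible_resources   ~ management_skills
intangible_resources ~ management_skills
financial_success    ~ tangible_resources + intangible_resources + management_skills
"""

model = Model(model_desc)
model.fit(data)
print(model.inspect())  # estimated path coefficients and their significance
```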

Relevance: 100.00%

Abstract:

The purpose of this study was to extend understanding of how large firms pursuing sustained and profitable growth manage organisational renewal. A multiple-case study was conducted in 27 North American and European wood-industry companies, of which 11 were chosen for closer study. The study combined the organisational-capabilities approach to strategic management with corporate-entrepreneurship thinking. It charted the further development of an identification and classification system for capabilities comprising three dimensions: (i) the dynamism between firm-specific and industry-significant capabilities, (ii) hierarchies of capabilities and capability portfolios, and (iii) their internal structure. Capability building was analysed in the context of the organisational design, the technological systems and the type of resource-bundling process (creating new capabilities vs. entrenching existing ones). The thesis describes the current capability portfolios and the organisational changes in the case companies. It also clarifies the mechanisms through which companies can influence the balance between knowledge search and the efficiency of knowledge transfer and integration in their daily business activities, and consequently the diversity of their capability portfolio and the breadth and novelty of their product/service range. The largest wood-industry companies of today must develop a seemingly dual strategic focus: they have to combine leading-edge, innovative solutions with cost-efficient, large-scale production. The use of modern technology in production was no longer a primary source of competitiveness in the case companies but rather belonged to the portfolio of basic capabilities. Knowledge and information management had become an industry imperative, on a par with cost effectiveness. Yet, during the period of this research, the case companies were better at supporting growth in the volume of existing activity than growth through new economic activities. Customer-driven, incremental innovation was preferred over firm-driven innovation through experimentation. The three main constraints on organisational renewal were the lack of slack resources, the aim for lean, centralised designs, and an inward-bound communication climate.

Relevance: 100.00%

Abstract:

Data assimilation provides an initial atmospheric state, called the analysis, for Numerical Weather Prediction (NWP). The analysis consists of pressure, temperature, wind and humidity on a three-dimensional NWP model grid, and data assimilation blends meteorological observations with the NWP model in a statistically optimal way. The objective of this thesis is to describe the methodological development carried out in order to allow data assimilation of ground-based measurements of the Global Positioning System (GPS) into the High Resolution Limited Area Model (HIRLAM) NWP system. Geodetic processing produces observations of tropospheric delay. These observations can be processed either for the vertical column at each GPS receiver station or for the individual propagation paths of the microwave signals, resulting in Zenith Total Delay (ZTD) and Slant Delay (SD) observations, respectively. ZTD and SD observations are of use in the analysis of atmospheric humidity. A method is introduced for estimating the horizontal error covariance of ZTD observations. The method makes use of observation minus model background (OmB) sequences of ZTD and conventional observations. It is demonstrated that the ZTD observation error covariance is relatively large at station separations shorter than 200 km, but non-zero covariances also appear at considerably larger station separations. The relatively low density of radiosonde observing stations limits the ability of the proposed estimation method to resolve the shortest length-scales of the error covariance. SD observations are shown to contain a statistically significant signal on the asymmetry of the atmospheric humidity field. However, the asymmetric component of SD is found to be nearly always smaller than the standard deviation of the SD observation error. SD observation modelling is described in detail, and other issues relating to SD data assimilation are also discussed, including the determination of error statistics, the tuning of observation quality control, and ways of taking local observation error correlations into account. The experiments show that the data assimilation system is able to retrieve the asymmetric information content of hypothetical SD observations at a single receiver station. Moreover, the impact of real SD observations on the humidity analysis is comparable to that of other observing systems.
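As a rough illustration of the kind of covariance estimation described above, the sketch below bins pairwise covariances of ZTD observation-minus-background departures by station separation; the data layout, the bin width and the Hollingsworth-Lönnberg-style interpretation in the closing comment are assumptions, not the thesis's exact procedure.

```python
# Illustrative sketch: estimate the covariance of ZTD observation-minus-background
# (OmB) departures as a function of station separation. Data layout is assumed:
# omb has one row per analysis time and one column per GPS station.
import numpy as np

def covariance_by_separation(omb, coords_km, bin_edges_km):
    """omb: (n_times, n_stations); coords_km: (n_stations, 2); returns bin centres
    and the mean station-pair covariance within each separation bin."""
    bin_edges_km = np.asarray(bin_edges_km, dtype=float)
    anomalies = omb - omb.mean(axis=0)                   # remove per-station bias
    cov = anomalies.T @ anomalies / (omb.shape[0] - 1)   # station-pair covariances
    sums = np.zeros(len(bin_edges_km) - 1)
    counts = np.zeros_like(sums)
    n_sta = omb.shape[1]
    for i in range(n_sta):
        for j in range(i + 1, n_sta):
            sep = np.linalg.norm(coords_km[i] - coords_km[j])
            k = np.searchsorted(bin_edges_km, sep) - 1
            if 0 <= k < len(sums):
                sums[k] += cov[i, j]
                counts[k] += 1
    centres = 0.5 * (bin_edges_km[:-1] + bin_edges_km[1:])
    return centres, sums / np.maximum(counts, 1)

# In a Hollingsworth-Lönnberg-style reading, the spatially correlated part of the
# binned covariance is attributed to background error, while the remainder at zero
# separation is attributed to (possibly spatially correlated) observation error.
```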

Relevance: 100.00%

Abstract:

The modern subject is what we may call a self-subjecting individual: someone into whose inner reality a more permanent governability has been implanted, a governability that works inside the agent. Michel Foucault's genealogy of the modern subject is the history of its constitution by power practices. By a flight of imagination, suppose that this history is not an evolving social structure or cultural phenomenon but one of those insects (a moth) whose life cycle consists of three stages or moments: crawling larva, encapsulated pupa, and flying adult. Foucault's history of power practices presents the same kind of miracle of total metamorphosis. The main forces in the general field of power can be apprehended through a generalisation of three rationalities functioning side by side in the plurality of different practices of power: domination, normalisation and the law. Domination is a force functioning by the rationality of reason of state: the state's essence is power, power is firm domination over people, and people are the state's resource by which the state's strength is measured. Normalisation is a force that takes hold of people from the inside of society: it imposes society's own reality, its empirical verity, as a norm on people through silently working jurisdictional operations that exclude pathological individuals who fall too far from the average of the population as a whole. The law is a counterforce to both domination and normalisation. Accounting for elements of legal practice as omnihistorical is not possible without a view of the general field of power. Without this view, and only in terms of the operations and tactical manoeuvres of the practice of law, nothing of the kind can be seen: the only thing that practice manifests is constant change itself. However, the backdrop of law's tacit dimension, that is, the power relations between law, domination and normalisation, allows one to see more. In the general field of power, the function of law is precisely to maintain the constant possibility of change. Whereas domination and normalisation would stabilise society, the law makes it move. The European individual has a reality as a problem. What is a problem? A problem, said Foucault, is something that allows entry into the field of thought; to be a problem, "it is necessary for a certain number of factors to have made it uncertain, to have made it lose its familiarity, or to have provoked a certain number of difficulties around it". Entering the field of thought through problematisations of the European individual (human forms, power and knowledge), one is able to glimpse the historical backgrounds of our present being. These were produced, and then again buried, in intersections between practices of power and games of truth. The problem of the European individual thus offers suitable circumstances for bringing to light forces that have passed through the individual over the centuries.

Relevance: 100.00%

Abstract:

ERP system implementations have spread so rapidly that they now represent a must-have within industries; ERP systems are viewed as the cost of doing business. Yet the research that has adopted the resource-based view on the business value of ERP systems concludes that companies may gain competitive advantage when they successfully manage their ERP projects, when they carefully reengineer the organization, and when they use the system in line with the organizational strategies. This thesis contributes to the literature on ERP business value by examining key drivers of ERP business value in organizations. The first research paper investigates how ERP systems with different degrees of system functionality are correlated with the development of business performance after the completion of the ERP projects. Companies with better perceived system functionality obtained efficiency benefits during the first two years of post-implementation; by the third year, however, there was no significant difference in efficiency benefits between successfully and less successfully managed ERP projects. The second research paper examines what business process changes occur in companies implementing ERP for different motivations and how these changes impact business performance. The findings show that companies reported process changes mainly in terms of workflow changes. In addition, companies with a business-led motivation focused more on observing the average costs of each increase in the input unit, whereas companies with a technology-led motivation focused more on the benefits arising from the fit of the system with the organizational processes. The third research paper considers the role of alignment between ERP and business strategies in the realization of business value from ERP use. These findings show that strategic alignment and business process changes are significantly correlated with the perceived benefits of ERP at three levels: internal efficiency, customers and finances. Overall, by combining quantitative and qualitative research methods, this thesis puts forward a model that illustrates how successfully managed ERP projects, aligned with the business strategy, have automating and informating effects on processes that ultimately improve customer service and reduce the companies’ costs.

Relevance: 100.00%

Abstract:

The overlapping sound pressure waves that enter our brain via the ears and auditory nerves must be organized into a coherent percept. Modelling the regularities of the auditory environment and detecting unexpected changes in these regularities, even in the absence of attention, is a necessary prerequisite for orientating towards significant information, as well as for speech perception and communication, for instance. The processing of auditory information, in particular the detection of changes in the regularities of the auditory input, gives rise to neural activity in the brain that is seen as a mismatch negativity (MMN) response of the event-related potential (ERP) recorded by electroencephalography (EEG). --- As recording the MMN requires neither a subject's behavioural response nor attention towards the sounds, it can be done even with subjects who have problems in communicating or difficulties in performing a discrimination task, for example aphasic and comatose patients, newborns, and even fetuses. Thus with the MMN one can follow the evolution of central auditory processing from the very early, often critical stages of development, and also in subjects who cannot be examined with the more traditional behavioural measures of auditory discrimination. Indeed, recent studies show that central auditory processing, as indicated by the MMN, is affected in different clinical populations, such as schizophrenics, as well as during normal aging and abnormal childhood development. Moreover, the processing of auditory information can be selectively impaired for certain auditory attributes (e.g., sound duration, frequency) and can also depend on the context of the sound changes (e.g., speech or non-speech). Although its advantages over behavioural measures are undeniable, a major obstacle to the larger-scale routine use of the MMN method, especially in clinical settings, is the relatively long duration of its measurement: typically, approximately 15 minutes of recording time is needed to measure the MMN for a single auditory attribute, so recording a complete central auditory processing profile consisting of several auditory attributes would require from one hour to several hours. In this research, I have contributed to the development of new, fast multi-attribute MMN recording paradigms in which several types and magnitudes of sound changes are presented in both speech and non-speech contexts in order to obtain a comprehensive profile of auditory sensory memory and discrimination accuracy in a short measurement time (altogether approximately 15 min for 5 auditory attributes). The speed of the paradigms makes them highly attractive for clinical research, their reliability brings fidelity to longitudinal studies, and the language context is especially suitable for studies on language impairments such as dyslexia and aphasia. In addition, I have presented an even more ecological paradigm and, more importantly, a result that is interesting in view of the theory of the MMN: MMN responses recorded entirely without a repetitive standard tone. All in all, these paradigms contribute to the development of the theory of auditory perception and increase the feasibility of MMN recordings in both basic and clinical research. Moreover, they have already proven useful in studying, for instance, dyslexia, Asperger syndrome and schizophrenia.
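To make the timing economy above concrete, here is a minimal Python sketch of one way such a multi-attribute stimulus sequence might be generated, with standards alternating with deviants that cycle through five auditory attributes; the attribute names, the alternation scheme and the stimulus rate are illustrative assumptions, not the exact paradigm developed in this research.

```python
# Illustrative sketch: build a multi-attribute oddball sequence in which every
# other stimulus is a standard and the intervening deviants cycle through five
# auditory attributes. All parameters are assumptions for illustration only.
import random

ATTRIBUTES = ["frequency", "duration", "intensity", "location", "gap"]

def multi_attribute_sequence(n_stimuli=1800, seed=0):
    rng = random.Random(seed)
    seq, pool = [], []
    for i in range(n_stimuli):
        if i % 2 == 0:
            seq.append("standard")
        else:
            if not pool:                 # refill so each attribute occurs
                pool = ATTRIBUTES[:]     # equally often, in random order
                rng.shuffle(pool)
            seq.append(pool.pop())
    return seq

# With a 500 ms onset-to-onset interval, 1800 stimuli take about 15 minutes and
# yield roughly 180 deviants per attribute, which illustrates the kind of economy
# that allows a five-attribute profile to be recorded in a single short session.
print(multi_attribute_sequence()[:12])
```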

Relevance: 100.00%

Abstract:

One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.

Relevance: 100.00%

Abstract:

Triggered by the very quick proliferation of Internet connectivity, electronic document management (EDM) systems are now rapidly being adopted for managing the documentation that is produced and exchanged in construction projects. Nevertheless, there are still substantial barriers to the efficient use of such systems, mainly of a psychological nature and related to insufficient training. This paper presents the results of empirical studies carried out during 2002 concerning the usage of EDM systems in the Finnish construction industry at that time. The studies employed three different methods in order to provide a multifaceted view of the problem area, on both the industry and the individual project level. In order to provide an accurate measurement of overall usage volume in the industry as a whole, telephone interviews were conducted with key personnel from 100 randomly chosen construction projects. The interviews showed that while around one third of large projects had already adopted EDM, very few small projects had adopted the technology. The barriers to introduction were investigated through interviews with representatives of half a dozen providers of systems and ASP services. These interviews shed considerable light on the dynamics of the market for this type of service and illustrated the diversity of business strategies adopted by vendors. In the final study, log files from a project which had used an EDM system were analysed in order to determine usage patterns. The results illustrated that use was as yet incomplete in coverage and that only some of the individuals involved in the project used the system efficiently, either as information producers or as consumers. The study also provided feedback on the usefulness of the log files.
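As an illustration of the log-file analysis mentioned in the final study, the sketch below tallies per-user activity from an EDM server log; the log format, field names and action labels are hypothetical assumptions, since the paper does not specify them.

```python
# Illustrative sketch: tally EDM usage per user from a hypothetical log file
# with lines of the form "timestamp;user;action;document" (format assumed).
import csv
from collections import Counter

uploads, downloads = Counter(), Counter()

with open("edm_project.log", newline="") as f:        # hypothetical log file
    for row in csv.reader(f, delimiter=";"):
        timestamp, user, action, document = row
        if action == "upload":
            uploads[user] += 1       # information producers
        elif action == "download":
            downloads[user] += 1     # information consumers

for user in sorted(set(uploads) | set(downloads)):
    print(f"{user}: {uploads[user]} uploads, {downloads[user]} downloads")
```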

Relevance: 100.00%

Abstract:

The most important French literary movement of the 1950s and 1960s, the nouveau roman, radically questioned the idea of the novel as storytelling, claiming that narratives create a false illusion of the world’s intelligibility. However, in the 1970s storytelling finds its way back into the French novel – a shift that has been characterized as the “return of the narrative”. In my article, I argue that the “narrative turn” in the French novel of the 1970s can be seen as a turn towards a fundamentally hermeneutic view of the narrative mediatedness of our relation to the world. From a hermeneutic perspective, the nouveaux romanciers – insofar as they reject the narrative in order to disclose the discontinuous, fragmentary and chaotic nature of reality – hang onto the positivistic idea that “real” is only that which is independent of human meaning-giving processes. By contrast, hermeneutists such as Paul Ricoeur consider the human experience of the world also to be real, and largely narrative in form. This view is shared by the principal novelists associated with the narrative turn, such as Michel Tournier, to whom man is a “mythological animal”. However, after the nouveau roman, narratives have lost their innocence: they no longer appear as “natural” but are conscious of their own narrativity, historicity, and the way they represent only one possible – inevitably ethically and politically charged – perspective on reality. By making storytelling thematic and by telling “counter-stories” that question prevailing models of sense-making, Tournier and the other “new storytellers” strive to promote critical reflection on the stories on the basis of which we orient ourselves to the world and narrate our lives – both as individuals and as communities.

Relevance: 100.00%

Abstract:

Drug-drug interactions may cause serious, even fatal, clinical consequences. It is therefore important to examine the interaction potential of new chemical entities early in drug development. Mechanism-based inhibition is a type of pharmacokinetic interaction that causes irreversible loss of enzyme activity and can therefore lead to unusually profound and long-lasting consequences. The in vitro–in vivo extrapolation (IVIVE) of drug-drug interactions caused by mechanism-based inhibition is challenging, and consequently many of these interactions have remained unrecognised for many years. The concomitant use of the fibrate-class lipid-lowering agent gemfibrozil markedly increases the concentrations and effects of some drugs. Fatal cases of rhabdomyolysis have even occurred in patients taking gemfibrozil and cerivastatin concomitantly. One of the main mechanisms behind this effect is the mechanism-based inhibition of the cytochrome P450 (CYP) 2C8 enzyme by a glucuronide metabolite of gemfibrozil, leading to increased cerivastatin concentrations. Although the clinical use of gemfibrozil has clearly decreased in recent years, gemfibrozil is still needed in some special cases. To enable safe use of gemfibrozil concomitantly with other drugs, information concerning the time and dose relationships of CYP2C8 inhibition by gemfibrozil is needed. This work comprised four in vivo clinical drug-drug interaction studies examining the time and dose relationships of the mechanism-based inhibitory effect of gemfibrozil on CYP2C8. The oral antidiabetic drug repaglinide was used as a probe drug for measuring CYP2C8 activity in healthy volunteers. Mechanism-based inhibition of the CYP2C8 enzyme by gemfibrozil was found to occur rapidly in humans: the inhibitory effect reached its maximum already when repaglinide was given 1-3 h after gemfibrozil intake. In addition, the inhibition was shown to abate slowly, and a full recovery of CYP2C8 activity, as measured by repaglinide metabolism, was achieved 96 h after cessation of gemfibrozil treatment. The dose-dependency of the mechanism-based inhibition of CYP2C8 by gemfibrozil was shown for the first time in this work: CYP2C8 activity was halved by a single 30 mg dose of gemfibrozil or by twice-daily administration of less than 30 mg of gemfibrozil, and it was decreased by over 90% by a single 900 mg dose of gemfibrozil or by twice-daily dosing of approximately 100 mg of gemfibrozil. Furthermore, by applying physiological models to the data obtained in the dose-dependency studies, the major role of mechanism-based inhibition of CYP2C8 in the interaction between gemfibrozil and repaglinide was confirmed. The results of this work support the proper use of gemfibrozil and the safety of patients. The information on the time-dependency of CYP2C8 inhibition by gemfibrozil may also give new insights for improving the IVIVE of the drug-drug interactions of new chemical entities, and the information obtained in this work may also be utilised in the design of future clinical drug-drug interaction studies.
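The time course described above (rapid onset of inhibition, slow recovery governed by enzyme turnover) can be illustrated with a standard mechanism-based inactivation model: a simple enzyme-activity balance with zero-order synthesis, first-order degradation and inhibitor-concentration-dependent inactivation, integrated with Euler steps. All rate constants and the inhibitor concentration profile below are hypothetical illustrations, not the physiological models actually applied in this work.

```python
# Illustrative sketch: mechanism-based enzyme inactivation with turnover,
#   dE/dt = k_deg * (E0 - E) - k_inact * I(t) / (K_I + I(t)) * E
# All parameter values are hypothetical, chosen only to show a rapid loss of
# activity during dosing and a slow, turnover-limited recovery afterwards.
import numpy as np

k_deg = 0.03     # 1/h, assumed enzyme degradation/synthesis rate (turnover)
k_inact = 0.5    # 1/h, assumed maximal inactivation rate
K_I = 10.0       # umol/L, assumed inhibitor conc. at half-maximal inactivation
E0 = 1.0         # baseline (relative) CYP2C8 activity

def inhibitor_conc(t_h, dose_interval=12.0, c_peak=50.0, half_life=1.5, t_stop=120.0):
    """Crude repeated-dose inhibitor profile (umol/L); dosing stops at t_stop hours."""
    if t_h > t_stop:
        return 0.0
    return c_peak * 0.5 ** ((t_h % dose_interval) / half_life)

dt = 0.01                       # h, Euler step
t = np.arange(0.0, 240.0, dt)   # simulate ten days
E = np.empty_like(t)
E[0] = E0
for i in range(1, len(t)):
    I = inhibitor_conc(t[i - 1])
    dEdt = k_deg * (E0 - E[i - 1]) - k_inact * I / (K_I + I) * E[i - 1]
    E[i] = E[i - 1] + dEdt * dt

print(f"Relative activity at end of dosing (120 h): {E[round(120 / dt)]:.2f}")
print(f"Relative activity 96 h after stopping (216 h): {E[round(216 / dt)]:.2f}")
```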