36 results for business methods


Relevance:

20.00%

Publisher:

Abstract:

Until recently, much of the discussion regarding the type of organization theory needed in management studies focused on the normative vs. descriptive roles of management science. Some authors, however, have noticed that even a descriptive theory can have a normative impact. Among other uses, management theories are drawn on by practitioners to make sense of their identity and roles in given contexts, and thus guide their attitudes, decision processes, and behavior. On this view, the sensemaking potential of a theory may be an important element in predicting its adoption by practitioners. Accordingly, theories are needed that better grasp the increased complexity of today's business environment and are therefore more relevant to practitioners. This article proposes a multi-faceted perspective of organizations. This implies moving away from a simplistic view of organizations and building a 'cubist' conception. Picasso's cubist paintings are characterized by the use of multiple perspectives within a single drawing. Similarly, I argue here that managers must learn not only to take on multiple responsibilities in their work, but also to develop an integrated conception of their managerial identity and of their organizations, in which the multiple social and economic dimensions are enmeshed. Social entrepreneurship is discussed as an illustration of a typical multi-faceted business.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The methodologies compared were: two iterative single-orientation methodologies minimizing the l2 or l1TV norm with prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of an extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
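
The abstract does not give the closed-form expression used by the MCF method. As an illustrative sketch only, the following Python snippet implements a generic l2-regularized closed-form dipole inversion (the classic Tikhonov-regularized variant, not the modulated solution of the paper); the phantom, voxel size, and regularization weight `lam` are assumptions made up for the example.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(0.65, 0.65, 0.65), b0_dir=(0, 0, 1)):
    """k-space dipole kernel D(k) = 1/3 - (k . b0)^2 / |k|^2."""
    ks = [np.fft.fftfreq(n, d=dz) for n, dz in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf                      # avoid division by zero at k = 0
    kb = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    return 1.0 / 3.0 - kb**2 / k2

def closed_form_qsm(field_map, lam=1e-2, voxel_size=(0.65, 0.65, 0.65)):
    """l2-regularized closed-form dipole inversion (illustrative, not the paper's MCF).

    chi = F^-1 [ conj(D) * F(field) / (|D|^2 + lam) ]
    """
    D = dipole_kernel(field_map.shape, voxel_size)
    chi_k = np.conj(D) * np.fft.fftn(field_map) / (np.abs(D)**2 + lam)
    return np.real(np.fft.ifftn(chi_k))

# Example on a synthetic phantom: a susceptibility sphere forward-simulated and inverted.
if __name__ == "__main__":
    shape = (64, 64, 64)
    chi_true = np.zeros(shape)
    zz, yy, xx = np.meshgrid(*[np.arange(n) - n // 2 for n in shape], indexing="ij")
    chi_true[xx**2 + yy**2 + zz**2 < 8**2] = 0.1             # 0.1 ppm sphere
    D = dipole_kernel(shape)
    field = np.real(np.fft.ifftn(D * np.fft.fftn(chi_true)))  # forward dipole model
    chi_rec = closed_form_qsm(field, lam=1e-3)
    print("max reconstructed susceptibility:", chi_rec.max())
```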

Relevance:

20.00%

Publisher:

Abstract:

Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. The evaluated methods encompass a mixture of classical techniques well known in the literature, such as diffusion tensor, Q-ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and brand-new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in choosing the most suitable technique for their studies.
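
The abstract names the two evaluation criteria (number of fiber populations and angular accuracy) but not the exact scoring procedure used in the challenge. The sketch below shows one plausible way to score a single voxel in Python; the greedy matching and the example directions are assumptions made for illustration, not the challenge's official metric.

```python
import numpy as np

def angular_error_deg(est_dirs, true_dirs):
    """Mean angular error (degrees) between estimated and ground-truth fiber directions.

    Each direction is a unit 3-vector; fibers are axes, so v and -v are equivalent.
    Greedy matching of each true direction to its closest estimate (illustrative only).
    """
    errors = []
    remaining = list(est_dirs)
    for t in true_dirs:
        if not remaining:
            break
        # |cos| handles the sign ambiguity of fiber orientations
        cosines = [min(abs(np.dot(t, e)), 1.0) for e in remaining]
        best = int(np.argmax(cosines))
        errors.append(np.degrees(np.arccos(cosines[best])))
        remaining.pop(best)
    return float(np.mean(errors)) if errors else float("nan")

# Example: a two-fiber crossing recovered with a small angular deviation.
true_dirs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
est_dirs = [np.array([0.996, 0.087, 0.0]), np.array([0.0, -1.0, 0.0])]  # ~5 deg off / flipped
est_dirs = [d / np.linalg.norm(d) for d in est_dirs]
print("n_est =", len(est_dirs), "n_true =", len(true_dirs))
print("mean angular error:", angular_error_deg(est_dirs, true_dirs), "deg")
```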

Relevance:

20.00%

Publisher:

Abstract:

Recent advances in sequencing technologies have given all microbiology laboratories access to whole-genome sequencing. Provided that tools for the automated analysis of sequence data, and databases for the associated metadata, are developed, whole-genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss the major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. Being a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.

Relevance:

20.00%

Publisher:

Abstract:

ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach makes it possible to reduce the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
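
The abstract lists the inputs of the heuristic tiered strategy (3T3 neutral red uptake cytotoxicity, neurotoxicity alerts from rat brain aggregates, octanol-water partition coefficients, and in silico absorption/blood-brain-barrier predictions) but not its thresholds or decision rules. The Python sketch below is a hypothetical two-step tiered classifier in that spirit; all cut-offs and the way the inputs are combined are invented for illustration and are not the strategy validated in the ACuteTox project.

```python
def tiered_acute_toxicity_class(ic50_3t3_nru_mg_l: float,
                                neurotoxicity_alert: bool,
                                log_p: float,
                                predicted_absorbed: bool,
                                predicted_bbb_passage: bool) -> str:
    """Return a coarse EU acute oral toxicity category or 'not classified' (hypothetical rules)."""
    # Step 1: baseline cytotoxicity from the 3T3 neutral red uptake assay,
    # used here as a stand-in for an LD50 estimate (hypothetical cut-offs).
    if ic50_3t3_nru_mg_l < 10:
        category = "Category 1-2 (LD50 <= 50 mg/kg b.w.)"
    elif ic50_3t3_nru_mg_l < 100:
        category = "Category 3 (50 < LD50 <= 300 mg/kg b.w.)"
    elif ic50_3t3_nru_mg_l < 1000:
        category = "Category 4 (300 < LD50 <= 2000 mg/kg b.w.)"
    else:
        category = "not classified (LD50 > 2000 mg/kg b.w.)"

    # Step 2: refine 'not classified' predictions with neurotoxicity alerts and
    # physico-chemical / in silico kinetics, to reduce false negatives.
    if category.startswith("not classified"):
        likely_bioavailable = predicted_absorbed and log_p > -1
        if neurotoxicity_alert and likely_bioavailable and predicted_bbb_passage:
            category = "Category 4 (300 < LD50 <= 2000 mg/kg b.w.)"
    return category

print(tiered_acute_toxicity_class(1500.0, True, 2.1, True, True))
```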

Relevance:

20.00%

Publisher:

Abstract:

Business cycle theory is normally described as having evolved out of a previous tradition of writers focusing exclusively on crises. In this account, the turning point is seen as residing in Clément Juglar's contribution on commercial crises and their periodicity. It is well known that the champion of this view is Schumpeter, who propagated it on several occasions. The same author, however, pointed to a number of other writers who, before and at the same time as Juglar, stressed one or another of the aspects for which Juglar is credited with primacy, including the recognition of periodicity and the identification of endogenous elements enabling the recognition of crises as a self-generating phenomenon. There is indeed a vast literature, both primary and secondary, relating to the debates on crises and fluctuations around the middle of the nineteenth century. From it, it is apparent that Juglar's book Des Crises Commerciales et de leur Retour Périodique en France, en Angleterre et aux États-Unis (originally published in 1862 and much revised and enlarged in 1889) did not come out of the blue, but was one of the products of an intellectual climate that encouraged crises to be thought of not as unrelated events but as part of a more complex phenomenon of recurring crises related to the development of the commercial world - an interpretation corroborated by the almost regular occurrence of crises at roughly 10-year intervals.

Relevance:

20.00%

Publisher:

Abstract:

For several decades, the mechanical properties of shallow formations (soils) obtained by sonic-to-ultrasonic wave testing have been reported to be greater than those based on mechanical tests. The present article, relying on a statistical analysis of more than 300 tests, shows that the elastic moduli of soils can indeed be obtained from (ultra)sonic tests and that they are identical to those resulting from mechanical tests.
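
The abstract does not state how the elastic moduli are derived from the wave measurements. For context, the snippet below applies the standard isotropic elastodynamic relations linking P- and S-wave velocities and bulk density to the small-strain shear and Young's moduli; these relations are textbook physics rather than results of the article, and the numerical values are made up.

```python
def elastic_moduli_from_velocities(vp_m_s: float, vs_m_s: float, density_kg_m3: float):
    """Small-strain elastic moduli of an isotropic medium from P- and S-wave velocities.

    Standard elastodynamic relations (not taken from the article above):
        G  = rho * vs^2
        nu = (vp^2 - 2 vs^2) / (2 (vp^2 - vs^2))
        E  = 2 G (1 + nu)
    """
    vp2, vs2 = vp_m_s**2, vs_m_s**2
    shear_modulus = density_kg_m3 * vs2                       # G, in Pa
    poisson_ratio = (vp2 - 2 * vs2) / (2 * (vp2 - vs2))       # nu, dimensionless
    youngs_modulus = 2 * shear_modulus * (1 + poisson_ratio)  # E, in Pa
    return youngs_modulus, shear_modulus, poisson_ratio

# Illustrative (made-up) values for a stiff soil: vp = 600 m/s, vs = 250 m/s, rho = 1900 kg/m3.
E, G, nu = elastic_moduli_from_velocities(600.0, 250.0, 1900.0)
print(f"E = {E/1e6:.0f} MPa, G = {G/1e6:.0f} MPa, nu = {nu:.2f}")
```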