15 results for Methodology Article
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Does shareholder value orientation lead to shareholder value creation? This article proposes methods to quantify both shareholder value orientation and shareholder value creation. Applying these models makes it possible to quantify both dimensions and to examine statistically to what extent shareholder value orientation explains shareholder value creation. The scoring model developed in this paper quantifies the orientation of managers toward the objective of maximizing shareholder wealth. The method evaluates information published by the companies and scores their value orientation on a scale from 0 to 10 points. Analytically, the variable value orientation is operationalized as the general attitude of managers toward the objective of value creation, investment policy and behavior, flexibility, and eight further value drivers. The value creation model works with market data such as stock prices and dividend payments. Both methods were applied to a sample of 38 blue-chip companies: 32 firms belonged to the share index IBEX 35 on July 1st, 1999; one company representing the “new economy” was listed in the Spanish New Market as of July 1st, 2001; and 5 European multinational groups formed part of the EuroStoxx 50 index, also on July 1st, 2001. The research period comprised the financial years 1998, 1999, and 2000. A regression analysis showed that between 15.9% and 23.4% of shareholder value creation can be explained by shareholder value orientation.
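As a rough illustration of the statistical step, a minimal sketch of regressing value creation on the 0-to-10 orientation score and reading off the share of explained variance; all figures below are invented, not the article's data.

```python
import numpy as np
from scipy import stats

# Hypothetical data: orientation scores (0-10 scale) and annualized
# shareholder value creation (%) for a handful of invented firms.
orientation = np.array([3.2, 5.1, 6.8, 4.0, 7.5, 2.9, 8.1, 5.6])
value_creation = np.array([4.1, 6.0, 9.2, 3.5, 11.0, 2.0, 10.4, 7.3])

# Ordinary least-squares regression of value creation on orientation.
result = stats.linregress(orientation, value_creation)

# r**2 is the share of value-creation variance explained by orientation,
# the kind of quantity the article reports as 15.9%-23.4%.
print(f"slope={result.slope:.2f}, R^2={result.rvalue**2:.3f}")
```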
Abstract:
One feature of the modern nutrition transition is the growing consumption of animal proteins. The most common approach in the quantitative analysis of this change has been the study of average food consumption, but this kind of analysis is incomplete without knowledge of the number of consumers, and data about consumers are not usually published in historical statistics. This article introduces a methodological approach for reconstructing consumer populations. The methodology is based on some assumptions about the diffusion process of foodstuffs and on modeling consumption patterns with a log-normal distribution. The estimation process is illustrated with the specific case of milk consumption in Spain between 1925 and 1981. The results fit quite well with other available data and indirect sources, showing that this dietary change was a slow and late process. The reconstruction of consumer populations could shed new light on the study of nutrition transitions.
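A minimal sketch of the kind of back-of-the-envelope estimate the log-normal assumption enables; the parameters below are invented, and the article's actual estimation procedure is more elaborate.

```python
import numpy as np

# Assume intake among actual consumers is log-normal with parameters
# (mu, sigma), and that national per-capita figures average consumers
# and non-consumers together. All numbers are invented.
mu, sigma = np.log(100.0), 0.5           # median 100 L/year among consumers
mean_among_consumers = np.exp(mu + sigma**2 / 2)

per_capita_observed = 40.0               # L/year from historical statistics

# If non-consumers drink nothing, the consumer share is the ratio of the
# observed per-capita mean to the mean among consumers.
consumer_share = per_capita_observed / mean_among_consumers
print(f"estimated share of consumers: {consumer_share:.1%}")
```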
Abstract:
Proposes a behavior-based scheme for the high-level control of autonomous underwater vehicles (AUVs). Two main characteristics of the control scheme can be highlighted. Behavior coordination is done through a hybrid methodology, which takes advantage of the robustness and modularity of competitive approaches, as well as of optimized trajectories ...
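One plausible reading of such a hybrid coordination rule, sketched below; the arbitration logic and threshold are assumptions for illustration, not the article's scheme.

```python
# Each behavior proposes a velocity setpoint with an activation level in
# [0, 1]; a near-saturated behavior suppresses the rest (competitive),
# otherwise responses are blended (cooperative).
def coordinate(behaviors, dominance_threshold=0.9):
    # behaviors: list of (activation, velocity_setpoint) pairs
    top = max(behaviors, key=lambda b: b[0])
    if top[0] >= dominance_threshold:
        return top[1]                                    # winner takes all
    total = sum(a for a, _ in behaviors) or 1.0
    return sum(a * v for a, v in behaviors) / total      # weighted blend

# Example: a nearly saturated obstacle-avoidance behavior dominates
# a path-following behavior.
print(coordinate([(0.95, -0.5), (0.4, 1.0)]))
```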
Abstract:
When underwater vehicles navigate close to the ocean floor, computer vision techniques can be applied to obtain motion estimates. A complete system to create visual mosaics of the seabed is described in this paper. Unfortunately, the accuracy of the constructed mosaic is difficult to evaluate, so the use of a laboratory setup to obtain an accurate error measurement is proposed. The system consists of a robot arm carrying a downward-looking camera. A pattern formed by a white background and a matrix of black dots uniformly distributed over the surveyed scene is used to find the exact image registration parameters. When the robot executes a trajectory (simulating the motion of a submersible), an image sequence is acquired by the camera. The motion estimated from the robot's encoders is refined by detecting, to subpixel accuracy, the black dots of the image sequence and computing the 2D projective transform relating two consecutive images. The pattern is then substituted by a poster of the sea floor and the trajectory is executed again, acquiring the image sequence used to test the accuracy of the mosaicking system.
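The registration step maps naturally onto standard tooling; a minimal sketch using OpenCV's homography estimation, with invented dot coordinates standing in for the detected pattern centers.

```python
import numpy as np
import cv2

# Matched dot centers detected in two consecutive frames (coordinates
# invented here); estimate the 2D projective transform (homography)
# relating the images.
pts_prev = np.array([[10, 10], [120, 12], [118, 95], [12, 98], [65, 55]],
                    dtype=np.float32)
pts_curr = pts_prev + np.float32([3.0, 1.5])   # simulated small camera motion

H, mask = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC)
# 3x3 matrix; in the real setup the dot centers are localized to
# subpixel accuracy before this estimation.
print(H)
```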
Abstract:
It has been shown that the accuracy of mammographic abnormality detection methods depends strongly on breast tissue characteristics: a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted as an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation (… and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System (BI-RADS) mammographic density assessment.
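A minimal sketch of step 3 under a naive-Bayes-style reading of "Bayesian combination" (the article's exact rule may differ): per-classifier posteriors over the density classes are multiplied and renormalized; all probabilities below are invented.

```python
import numpy as np

# Posterior probabilities from three classifiers over four BI-RADS
# density classes (rows: classifiers; columns: classes). Invented values.
posteriors = np.array([
    [0.10, 0.20, 0.50, 0.20],
    [0.05, 0.25, 0.60, 0.10],
    [0.15, 0.15, 0.55, 0.15],
])

# Multiply per-class posteriors (independence assumption) and renormalize.
combined = posteriors.prod(axis=0)
combined /= combined.sum()
print("predicted class:", combined.argmax(), "posterior:", round(combined.max(), 3))
```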
Abstract:
The space and time discretization inherent to all FDTD schemes introduce non-physical dispersion errors, i.e., deviations of the speed of sound from the theoretical value predicted by the governing Euler differential equations. A general methodology for computing this dispersion error via straightforward numerical simulations of the FDTD schemes is presented. The method is shown to provide remarkable accuracies of the order of 1/1000 in a wide variety of two-dimensional finite difference schemes.
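The measurement idea can be sketched in one dimension (the article treats 2-D schemes): run a standard second-order FDTD update, track a propagating pulse, and compare the measured speed with the theoretical sound speed. All parameters below are illustrative.

```python
import numpy as np

# Second-order FDTD for the 1-D wave equation with periodic boundaries.
c, dx = 340.0, 0.01
C = 0.5                                  # Courant number (< 1 for stability)
dt = C * dx / c
x = np.arange(4000) * dx

f = lambda s: np.exp(-((s - 5.0) / 0.1) ** 2)
u = f(x)                                 # pulse at t = 0
u_prev = f(x + c * dt)                   # t = -dt, so the pulse moves right

steps = 4000
for _ in range(steps):
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    u, u_prev = 2 * u - u_prev + C**2 * lap, u

# Numerical propagation speed from the peak displacement vs. elapsed time.
measured_speed = (x[np.argmax(u)] - 5.0) / (steps * dt)
print(f"relative dispersion error: {abs(measured_speed - c) / c:.2e}")
```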
Abstract:
The emergence of Web 2.0 technologies in recent years has changed the way people interact with knowledge. Services for cooperation and collaboration have placed the user at the centre of a new knowledge-building space. The development of new second-generation learning environments can benefit from the potential of these Web 2.0 services when applied in an educational context. We propose a methodology for designing learning environments that relates Web 2.0 services to the functional requirements of these environments. In particular, we concentrate on the design of the KRSM system to discuss the components of this methodology and its application.
Abstract:
The alignment between competences, teaching-learning methodologies, and assessment is a key element of the European Higher Education Area. This paper presents the efforts carried out by six Telematics, Computer Science and Electronic Engineering Education teachers towards achieving this alignment in their subjects. In joint work with pedagogues, a set of recommended actions was identified. A selection of these actions was applied and evaluated in the six subjects. The cross-analysis of the results indicates that the actions allow students to better understand the methodologies and assessment planned for the subjects, facilitate (self-)regulation, and increase students' involvement in the subjects.
Abstract:
During the period 1996-2000, forty-three heavy rainfall events were detected in the Internal Basins of Catalonia (northeastern Spain). Most of these events caused floods and serious damage. This high number leads to the need for a methodology to classify such events on the basis of their surface rainfall distribution, their internal organization, and their physical features. The aim of this paper is to present a methodology for systematically analyzing the convective structures responsible for those heavy rainfall events on the basis of the information supplied by meteorological radar. The proposed methodology is as follows. Firstly, the rainfall intensity and the surface rainfall pattern are analyzed on the basis of the raingauge data. Secondly, the convective structures at the lowest level are identified and characterized using a 2-D algorithm, and the convective cells are identified using a 3-D procedure that looks for the reflectivity cores in every radar volume. Thirdly, the convective cells (3-D) are associated with the 2-D structures (convective rainfall areas). This methodology has been applied to the 43 heavy rainfall events using the meteorological radar located near Barcelona and the SAIH automatic raingauge network.
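A minimal sketch of the 2-D identification step: label contiguous regions of the lowest-level reflectivity field above a convective threshold. The threshold and the synthetic field are assumptions; the article's algorithm applies further criteria.

```python
import numpy as np
from scipy import ndimage

# Fake lowest-level reflectivity field in dBZ (stand-in for radar data).
reflectivity = np.random.default_rng(0).uniform(0, 60, size=(200, 200))

convective_mask = reflectivity >= 43.0          # assumed convective threshold
labels, n_structures = ndimage.label(convective_mask)

# Pixel count of each labeled 2-D structure.
sizes = np.asarray(ndimage.sum(convective_mask, labels,
                               index=range(1, n_structures + 1)))
print(f"{n_structures} candidate 2-D structures; "
      f"largest spans {sizes.max():.0f} pixels")
```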
Abstract:
Two trends that presently exist in relation to the concept of Paleontology are analyzed, pointing out some of the aspects that have a negative influence. Various reflections are made based on examples concerning some of the principal points of paleontological method, such as the influence of punctual sampling, the meaning of size-frequency distributions, and subjectivity in the identification of fossils. Topics that have a marked repercussion on diverse aspects of Paleontology are discussed.
Abstract:
This paper introduces a new approach to the analysis of offensive play in football. The main aim of this study was to create an instrument for collecting information for the analysis of offensive actions and game interactions. The observation instrument used to accomplish this objective consists of a combination of field formats (FC) and category systems (SC). Observational methodology is a particular strategy of the scientific method whose objective is to analyze perceptible behaviour occurring in habitual contexts, allowing it to be formally recorded and quantified using an ad hoc instrument. Once the systematic registrations of behaviour have been transformed into quantitative data with the necessary level of reliability and validity, the relations between these behaviours can be analyzed. The codifications undertaken to date in various football matches have shown that the instrument serves the purposes for which it was developed, allowing further research into offensive play in football.
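A minimal sketch of what one systematic-observation record might look like; the field names and categories below are invented, not the instrument's actual criteria.

```python
from dataclasses import dataclass

# Each offensive action is coded with field-format and category-system
# values so the registrations can later be analyzed quantitatively.
@dataclass
class OffensiveActionRecord:
    match_id: str
    time_s: int          # match clock when the action starts
    zone: str            # pitch zone (field-format dimension)
    action: str          # category: e.g. "pass", "dribble", "shot"
    outcome: str         # category: result of the interaction

log = [OffensiveActionRecord("match01", 754, "Z7", "pass", "keeps_possession")]
print(log[0])
```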
Abstract:
This article presents an optimization methodology for batch production processes with shared resources, which relies on a mapping of state-events into time-events, allowing in this way the straightforward use of well-consolidated scheduling policies developed for manufacturing systems. A technique to generate the timed Petri net representation from a continuous dynamic representation (systems of Differential-Algebraic Equations, DAEs) of the production system is presented, together with the main characteristics of a Petri-net-based tool implemented for optimization purposes. This paper also describes how the implemented tool generates the coverability tree and how the tree can be pruned by a general-purpose heuristic. An example of a distillation process with two shared batch resources is used to illustrate the proposed optimization methodology.
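A minimal sketch of reachability-tree generation for a tiny invented Petri net; a full coverability tree additionally introduces the ω symbol for unbounded places, omitted here.

```python
from collections import deque

# Transitions as (consume, produce) maps from place index to token count.
transitions = {
    "t1": ({0: 1}, {1: 1}),   # move a token from p0 to p1
    "t2": ({1: 1}, {0: 1}),   # and back
}

def fire(marking, consume, produce):
    """Return the successor marking, or None if the transition is disabled."""
    if any(marking[p] < n for p, n in consume.items()):
        return None
    m = list(marking)
    for p, n in consume.items():
        m[p] -= n
    for p, n in produce.items():
        m[p] += n
    return tuple(m)

seen, queue = set(), deque([(1, 0)])      # initial marking: one token in p0
while queue:                              # breadth-first expansion; a
    m = queue.popleft()                   # scheduling heuristic would prune
    if m in seen:                         # this frontier
        continue
    seen.add(m)
    for consume, produce in transitions.values():
        child = fire(m, consume, produce)
        if child is not None:
            queue.append(child)

print("reachable markings:", sorted(seen))
```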
Abstract:
This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. The framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight teams that participated. We describe the available database, comprising multi-center, multi-vendor, and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that, when semi-automatic methods are used, the vessel lumen and media can be segmented with an accuracy comparable to manual annotation, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
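A minimal sketch of two measures commonly used to score a segmentation against a reference standard, shown on fake masks; whether they coincide with the paper's three measures is not asserted here.

```python
import numpy as np

# Fake binary masks: an automatic lumen segmentation and an expert reference.
auto = np.zeros((64, 64), bool); auto[10:40, 10:40] = True
ref  = np.zeros((64, 64), bool); ref[12:42, 12:42] = True

# Jaccard index: overlap area divided by union area.
jaccard = np.logical_and(auto, ref).sum() / np.logical_or(auto, ref).sum()

# Percentage area difference relative to the reference.
area_diff = abs(int(auto.sum()) - int(ref.sum())) / ref.sum()

print(f"Jaccard = {jaccard:.3f}, area difference = {area_diff:.3f}")
```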
Abstract:
A method to generate carbonyl compounds from alkynes under mild and neutral conditions, with excellent functional group compatibility and high yields, is described. Hydration takes place under catalytic conditions using 0.1 to 0.2 equivalents of the easily available and inexpensive mercury(II) p-toluenesulfonamidate in a hydroalcoholic solution. After use, the catalyst is inertized and/or recycled ...
Abstract:
Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution; nevertheless, its application and methodology in this context had not been reviewed.
Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application.
Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level.
Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly the symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level.
Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method for analyzing variables at the individual level are put to little use.
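The time-stratified referent selection mentioned above has a standard definition that is easy to sketch: control days share the year, month, and weekday of the event day.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day: date):
    """Referent days: same year, month, and weekday as the event day."""
    refs, d = [], date(event_day.year, event_day.month, 1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs

# Example: the other Fridays of July 2000 serve as controls for July 14.
print(time_stratified_referents(date(2000, 7, 14)))
```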