82 results for usefulness
Abstract:
We use best-practice benchmarking rationales to propose a dynamic research design that accounts for the endogenous components of heterogeneous across-firm routines in order to study changes in performance and their link to organizational knowledge investments. We thus contribute to the operationalization of management theoretical frameworks based on resources and routines. The research design employs frontier measures that provide industry-level benchmarking in organizational settings, and proposes some new indicators for firm-level strategic benchmarking. A profit-oriented analysis of the U.S. technology industry during 2000-2011 illustrates the usefulness of our design. Findings reveal that industry revival following economic distress is accompanied by wider gaps between the best and worst performers. Second-stage analyses show that increasing intangibles stocks is positively associated with fixed-target benchmarking, while enhancing R&D spending is linked to local frontier progress. The discussion develops managerial interpretations of the benchmarking measures that are suitable for control mechanisms and reward systems.
Abstract:
This paper describes the development and applications of a super-resolution method known as Super-Resolution Variable-Pixel Linear Reconstruction. The algorithm works by combining different lower-resolution images to obtain a higher-resolution image. We show that it can make significant spatial-resolution improvements to satellite images of the Earth's surface, allowing recognition of objects with sizes approaching the limiting spatial resolution of the lower-resolution images. The algorithm is based on the Variable-Pixel Linear Reconstruction algorithm developed by Fruchter and Hook, a well-known method in astronomy that had never been used for Earth remote sensing purposes. The algorithm preserves photometry, can weight input images according to the statistical significance of each pixel, and removes the effect of geometric distortion on both image shape and photometry. In this paper, we describe its development for remote sensing purposes, show the usefulness of the algorithm when working with images as different from astronomical images as remote sensing ones, and show applications to: 1) a set of simulated multispectral images obtained from a real Quickbird image; and 2) a set of real multispectral Landsat Enhanced Thematic Mapper Plus (ETM+) images. These examples show that the algorithm provides a substantial improvement in limiting spatial resolution for both simulated and real data sets without significantly altering the multispectral content of the input low-resolution images, without amplifying the noise, and with very few artifacts.
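As a rough illustration of the variable-pixel ("drizzle") co-addition step this abstract refers to, the sketch below combines registered low-resolution frames with known subpixel shifts onto a finer grid. The function name, the point-footprint simplification and the unit weights are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def drizzle_combine(images, shifts, scale=2):
    """Naive drizzle-style co-addition (an illustrative sketch, not the
    authors' code): each low-resolution pixel is dropped as a point onto
    a grid `scale` times finer, and the accumulated flux is normalised
    by the accumulated weight, which preserves photometry."""
    h, w = images[0].shape
    out = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(out)
    for img, (dy, dx) in zip(images, shifts):
        # Map each low-res pixel centre (plus its known subpixel shift,
        # in low-res pixel units) onto the fine grid.
        ys = np.round((np.arange(h)[:, None] + dy) * scale).astype(int)
        xs = np.round((np.arange(w)[None, :] + dx) * scale).astype(int)
        ys = np.clip(ys, 0, h * scale - 1)
        xs = np.clip(xs, 0, w * scale - 1)
        Y = np.broadcast_to(ys, (h, w))
        X = np.broadcast_to(xs, (h, w))
        # ufunc.at accumulates correctly even with repeated indices.
        np.add.at(out, (Y, X), img)
        np.add.at(weight, (Y, X), 1.0)
    # Normalise where at least one input pixel contributed; elsewhere 0.
    return np.divide(out, weight, out=np.zeros_like(out), where=weight > 0)
```

With enough frames at well-spread subpixel shifts the fine grid fills in and resolution improves; per-pixel statistical weighting and distortion correction, which the real method includes, are omitted here.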
Abstract:
This article has an immediate predecessor, upon which it is based and with which readers should be familiar: Towards a Theory of the Credit-Risk Balance Sheet (Vallverdú, Somoza and Moya, 2006). There, the Balance Sheet is conceptualised on the basis of the duality of a credit-based transaction; that work deals with its theoretical foundations, providing evidence of a causal credit-risk duality, that is, a true causal relationship, and analyzes its characteristics and properties, both static and dynamic. The present article, a logical continuation of the previous one, studies the evolution of the structure of the Credit-Risk Balance Sheet as a consequence of a business's dynamics in the credit area. Given the Credit-Risk Balance Sheet of a company at any given time, it attempts to estimate, by means of sequential analysis, its structural evolution, showing its usefulness in the management and control of credit and risk. To do this, it draws, with the necessary adaptations, on the by-now classic works of Palomba and Cutolo. The establishment of the corresponding transformation matrices allows one to move from an initial balance-sheet structure to a final, future one, to understand trends in the credit-risk situation, and to make its monitoring and control possible, basic elements in providing support for risk management.
Abstract:
We study energy-weighted sum rules of the pion and kaon propagator in nuclear matter at finite temperature. The sum rules are obtained by matching the Dyson form of the meson propagator with its spectral Lehmann representation at low and high energies. We calculate the sum rules for specific models of the kaon and pion self-energy. The in-medium spectral densities of the K and K̄ mesons are obtained from a chiral unitary approach in coupled channels that incorporates the S and P waves of the kaon-nucleon interaction. The pion self-energy is determined from the P-wave coupling to particle-hole and Delta-hole excitations, modified by short-range correlations. The sum rules for the lower-energy weights are fulfilled satisfactorily and reflect the contributions from the different quasiparticle and collective modes of the meson spectral function. We discuss the sensitivity of the sum rules to the distribution of spectral strength and their usefulness as quality tests of model calculations.
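Schematically (a generic illustration of the matching described above, not the paper's exact expressions), such sum rules arise from equating large-energy expansions of the two representations of a boson propagator:

```latex
% Dyson form of the propagator with self-energy \Pi:
D(\omega,\vec q\,) = \frac{1}{\omega^2 - \vec q^{\,2} - m^2 - \Pi(\omega,\vec q\,)}
% Lehmann (spectral) representation with spectral function S:
D(\omega,\vec q\,) = \int_0^\infty d\omega'\, 2\omega'\,
    \frac{S(\omega',\vec q\,)}{\omega^2 - \omega'^2 + i\varepsilon}
% Expanding both sides in powers of 1/\omega^2 and matching coefficients
% order by order yields energy-weighted moment sum rules, e.g. at
% leading order the normalisation condition
\int_0^\infty d\omega\, 2\omega\, S(\omega,\vec q\,) = 1 .
```

For the kaon case the spectral integral combines the K and K̄ strengths, and higher moments bring in moments of the self-energy, which is what makes the sum rules sensitive to how spectral strength is distributed.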
Abstract:
Background: Limited data from short series of patients suggest that lymphocytic enteritis (classically considered latent coeliac disease) may produce symptoms of malabsorption, although the true prevalence of this situation is unknown. Serological markers of coeliac disease are of little diagnostic value in identifying these patients. Aims: To evaluate the usefulness of human leucocyte antigen-DQ2 genotyping followed by duodenal biopsy for the detection of gluten-sensitive enteropathy in first-degree relatives of patients with coeliac disease, and to assess the clinical relevance of lymphocytic enteritis diagnosed with this screening strategy. Patients and methods: 221 first-degree relatives of 82 DQ2+ patients with coeliac disease were consecutively included. Duodenal biopsy (for histological examination and tissue transglutaminase antibody assay in culture supernatant) was carried out on all DQ2+ relatives. Clinical features, biochemical parameters and bone mineral density were recorded. Results: 130 relatives (58.8%) were DQ2+, showing the following histological stages: 64 (49.2%) Marsh 0; 32 (24.6%) Marsh I; 1 (0.8%) Marsh II; 13 (10.0%) Marsh III; 15.4% refused the biopsy. 49 relatives showed gluten-sensitive enteropathy, 46 with histological abnormalities and 3 with Marsh 0 but positive tissue transglutaminase antibody in culture supernatant. Only 17 of the 221 relatives had positive serological markers. The difference in diagnostic yield between the proposed strategy and serology was significant (22.2% v 7.2%, p<0.001). Relatives with Marsh I and Marsh II-III were more often symptomatic (56.3% and 53.8%, respectively) than relatives with normal mucosa (21.1%; p=0.002). Marsh I relatives had more severe abdominal pain (p=0.006), severe distension (p=0.047) and anaemia (p=0.038) than those with Marsh 0. The prevalence of abnormal bone mineral density was similar in relatives with Marsh I (37%) and Marsh III (44.4%).
Conclusions: The high number of symptomatic patients with lymphocytic enteritis (Marsh I) supports the need for a strategy based on human leucocyte antigen-DQ2 genotyping followed by duodenal biopsy in relatives of patients with coeliac disease and modifies the current concept that villous atrophy is required to prescribe a gluten-free diet.
Abstract:
Background: Hospitals in countries with public health systems have recently adopted organizational changes to improve efficiency and resource allocation, and reducing inappropriate hospitalizations has been established as an important goal. Aims: Our goal was to describe the functioning of a Quick Diagnosis Unit in a Spanish public university hospital after evaluating 1,000 consecutive patients. We also aimed to ascertain the degree of satisfaction among Quick Diagnosis Unit patients and the costs of the model compared to conventional hospitalization practices. Design: Observational, descriptive study. Methods: Our sample comprised 1,000 patients evaluated between November 2008 and January 2010 in the Quick Diagnosis Unit of a tertiary public university hospital in Barcelona. Included patients were those who had potentially severe diseases and would normally require hospital admission for diagnosis, but whose general condition allowed outpatient treatment. We analyzed several variables, including time to diagnosis, final diagnoses and hospitalizations avoided, and we also investigated the mean cost (as compared to conventional hospitalization) and the patients' satisfaction. Results: In 88% of cases, the reasons for consultation were anemia, anorexia-cachexia syndrome, febrile syndrome, adenopathies, abdominal pain, chronic diarrhea and lung abnormalities. The most frequent diagnoses were cancer (18.8%; mainly colon cancer and lymphoma) and iron-deficiency anemia (18%). The mean time to diagnosis was 9.2 days (range 1 to 19 days). An estimated 12.5 admissions/day over a one-year period (in the internal medicine department) were avoided. In a subgroup analysis, the mean cost per process (admission-discharge) was 3,416.13 Euros for a conventional hospitalization, compared with 735.65 Euros in the Quick Diagnosis Unit. Patients expressed a high degree of satisfaction with Quick Diagnosis Unit care.
Conclusions: Quick Diagnosis Units represent a useful and cost-saving model for the diagnostic study of patients with potentially severe diseases. Future randomized study designs involving comparisons between control and intervention groups would help elucidate the usefulness of Quick Diagnosis Units as an alternative to conventional hospitalization.
Abstract:
A novel and simple procedure for concentrating adenoviruses from seawater samples is described. The technique entails the adsorption of viruses to pre-flocculated skimmed milk proteins, allowing the flocs to sediment by gravity, and dissolving the separated sediment in phosphate buffer. Concentrated virus may be detected by PCR techniques following nucleic acid extraction. The method requires no specialized equipment other than that usually available in routine public health laboratories, and due to its straightforwardness it allows the processing of a larger number of water samples simultaneously. The usefulness of the method was demonstrated in concentration of virus in multiple seawater samples during a survey of adenoviruses in coastal waters.
Abstract:
This paper describes L. A. Zadeh's theory of fuzzy sets (background, characteristics and implications) and the areas in which fuzziness has been applied in psychology and social psychology (developmental psychology, stimulus processing, perception of information, prototypes and other applications). On this basis, it suggests how fuzziness could be useful in the study of social interaction, assuming the simultaneously vague and precise character of reality, and how concepts such as the notion of the self could be used from a complex viewpoint that considers, from the perspective of pluralism, diverse theoretical and methodological positions.
Abstract:
This study aimed to investigate the behaviour of two indicators of influenza activity in the area of Barcelona and to evaluate the usefulness of modelling them to improve the detection of influenza epidemics. Design: Descriptive time-series study using the number of deaths from all causes registered by funeral services and reported cases of influenza-like illness. The study covered five influenza seasons, from week 45 of 1988 to week 44 of 1993. The weekly numbers of deaths and of registered influenza-like illness cases were processed by identifying a time-series ARIMA model. Setting: Six large towns in the province of Barcelona, each with more than 60,000 inhabitants and its own funeral services. Main results: For mortality, the proposed model was an autoregressive one of order 2 (ARIMA(2,0,0)), and for morbidity it was one of order 3 (ARIMA(3,0,0)). Finally, the two time series were analysed together to detect possible relationships between them. The joint study of the two series shows that the mortality series can be modelled separately from the reported morbidity series, but the morbidity series is influenced as much by the number of previous influenza cases reported as by the previous mortality registered. Conclusions: The model based on general mortality is useful for detecting epidemic influenza activity. However, because there is no absolute gold standard defining the beginning of an epidemic, the final decision on when an epidemic is declared and control measures are recommended should be taken after evaluating all the indicators included in the influenza surveillance programme.
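The surveillance idea behind the mortality model above can be sketched as follows. This is an illustrative toy (simulated data, plain least-squares fitting), not the study's series or estimation procedure: an autoregressive model of order 2, ARIMA(2,0,0) in the paper's notation, is fitted to a weekly series and weeks far above the one-step-ahead prediction are flagged.

```python
import numpy as np

# Simulate a weekly mortality-like AR(2) series (illustrative only).
rng = np.random.default_rng(0)
n = 260  # roughly five seasons of weekly counts
y = np.full(n, 50.0)
for t in range(2, n):
    y[t] = 50 + 0.5 * (y[t - 1] - 50) + 0.2 * (y[t - 2] - 50) + rng.normal(0, 5)

# Fit the AR(2) model by least squares: intercept, lag-1 and lag-2 terms.
X = np.column_stack([np.ones(n - 2), y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)

pred = X @ coef                       # one-step-ahead in-sample predictions
resid = y[2:] - pred
alert_band = pred + 2 * resid.std()   # simple epidemic alert threshold
alerts = np.flatnonzero(y[2:] > alert_band)
```

Weeks in `alerts` would be candidates for epidemic activity; as the abstract stresses, such a statistical flag should be weighed against all other surveillance indicators before declaring an epidemic.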
Abstract:
This article was delivered as an area paper at the Critical Political Science Meeting in Bilbao on 15 November 2008, organized by the Political Science Department of the UPV (University of the Basque Country). The paper introduces an updated and synthetic version of the model designed by S.M. Lipset and S. Rokkan in 1967 to identify the confrontational divides distinctive of European modernization and, in this way, trace the origins of modern party systems. The expanded model proposed is applied, on the one hand, to a variety of empirical cases, most prominently the post-transitional Spanish case; on the other hand, it demonstrates its usefulness for better understanding the distinctive structure of social conflict in the globalization era.
Abstract:
The play Pygmalion, which Bernard Shaw wrote in 1912 and premiered the following year, is an artistic proclamation of linguistics. Despite the popularity of the play, this event is known only as an extravagant anecdote, and the history of linguistics has overlooked the extraordinary merit of Pygmalion. Its merit lies in the play's theatrical quality, Shaw's prospective capacity and the social intention of his message. Shaw announces the usefulness of linguistics in facets that, decades later, would become known as sociolinguistics, language planning and speech therapy. Furthermore, the character of Professor Higgins has been linked, in a simplistic way, to the phonetician H. Sweet.
Abstract:
The aim of this study was to assess the usefulness of virtual environments representing situations that are emotionally significant to subjects with eating disorders (ED). These environments may be applied with both evaluative and therapeutic aims, and in simulation procedures for carrying out a range of experimental studies. This paper is part of a wider research project analyzing how the situation to which subjects are exposed influences their performance on body-image estimation tasks. Thirty female patients with eating disorders were exposed to six virtual environments: a living room (neutral situation), a kitchen with high-calorie food, a kitchen with low-calorie food, a restaurant with high-calorie food, a restaurant with low-calorie food, and a swimming pool. After exposure to each environment, the STAI-S (a measure of state anxiety) and the CDB (a measure of depression) were administered to all subjects. The results show that virtual reality instruments are particularly useful for simulating everyday situations that may provoke emotional reactions, such as anxiety and depression, in patients with ED. Virtual environments in which subjects are obliged to ingest high-calorie food provoke the highest levels of state anxiety and depression.
Abstract:
We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramér-von Mises-type tests for the correct specification of predictive densities, robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
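The classical ingredient behind such tests can be sketched as follows: the probability integral transform (PIT) of each realisation under the predictive CDF is uniform on [0,1] when the density is correctly specified, and a Kolmogorov-Smirnov distance from uniformity detects departures. This toy version ignores precisely what the paper adds (robustness to dynamic mis-specification and to instabilities over subsamples); names and data are illustrative.

```python
import numpy as np
from math import erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return np.array([0.5 * (1.0 + erf(v / np.sqrt(2.0))) for v in x])

def ks_uniform(z):
    """Kolmogorov-Smirnov distance of a sample z from the U(0,1) CDF."""
    z = np.sort(z)
    i = np.arange(1, z.size + 1)
    return max(np.max(i / z.size - z), np.max(z - (i - 1) / z.size))

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, 500)     # realisations, truly N(0,1)
z_good = norm_cdf(y)              # PIT under the correct N(0,1) density
z_bad = norm_cdf(y - 1.0)         # PIT under a mis-specified N(1,1) density
crit = 1.36 / np.sqrt(y.size)     # asymptotic 5% value for i.i.d. data
```

Here `ks_uniform(z_bad)` far exceeds the i.i.d. critical value while `z_good` stays close to uniform; adjusting such critical values for estimated parameters, dynamics and subsample instabilities is the paper's contribution.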
Abstract:
This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack the power to detect predictive ability and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
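The window-size concern above can be sketched in a toy setting (illustrative model and names, not the paper's tests): evaluate a simple rolling-mean forecast over a whole range of estimation window sizes, since reporting only the best-looking window invites exactly the data snooping the paper warns about.

```python
import numpy as np

rng = np.random.default_rng(2)
y = 0.3 + rng.normal(0.0, 1.0, 400)   # toy target series

def rolling_mse(series, w):
    """Mean squared error of rolling-mean forecasts with window size w."""
    errs = [(series[t] - series[t - w:t].mean()) ** 2
            for t in range(w, len(series))]
    return float(np.mean(errs))

# Inspect the whole curve of window sizes rather than one hand-picked w.
window_sizes = [20, 40, 80, 160]
mse_curve = {w: rolling_mse(y, w) for w in window_sizes}
```

A model whose advantage survives across the whole `mse_curve`, rather than at a single window, gives far more credible evidence of predictive ability.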