959 results for "Equity Portfolio with Equal Weights"


Relevance: 100.00%

Abstract:

OBJECTIVE: Computed tomography (CT) and magnetic resonance imaging (MRI) have been introduced as an alternative to traditional autopsy. The purpose of this study was to investigate their accuracy in mass estimation of the liver and spleen. METHODS: In 44 cases, the weights of the spleen and liver were estimated from MRI and CT data using volume-analysis software and a postmortem tissue-specific density factor. In a blinded approach, the results were compared with the weights recorded at autopsy. RESULTS: Excellent correlation between estimated and actual weights (r = 0.997 for MRI, r = 0.997 for CT) was found. Putrefaction gas and venous air embolism led to overestimation, and venous congestion and drowning also produced higher estimated weights. CONCLUSION: Postmortem weights of the liver and spleen can be accurately assessed by nondestructive imaging. Multislice CT overcomes the limitations posed by putrefaction and venous air embolism because gas can be excluded from the volume measurement. Congestion seems to be assessed even better.
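
The estimation step itself is simple arithmetic: segmented organ volume multiplied by a tissue-specific density. A minimal sketch, with placeholder numbers, since the abstract does not give the density factors the study used:

```python
def organ_mass_g(n_voxels: int, voxel_volume_ml: float, density_g_per_ml: float) -> float:
    """Mass estimate: segmented volume times a postmortem tissue-specific density."""
    return n_voxels * voxel_volume_ml * density_g_per_ml

# Placeholder values: 1 mm^3 voxels and a density of 1.05 g/ml (illustrative only;
# the study's actual density factors are not stated in the abstract).
print(f"{organ_mass_g(1_500_000, 0.001, 1.05):.0f} g")  # ~1575 g for a 1.5 l liver
```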

Relevance: 100.00%

Abstract:

Since 1991, six years after the recommendation of universal childhood triple vaccination against measles, mumps and rubella (MMR), Switzerland has been confronted with an increasing number of mumps cases affecting both vaccinated and unvaccinated children. The MMR vaccine mainly used in the Swiss population after 1986 contains the highly attenuated Rubini strain of mumps virus. We analysed an outbreak of 102 suspected mumps cases by virus isolation, determination of IgM antibodies to mumps virus in 27 acute-phase sera, and verification of vaccination histories. Mumps was confirmed by virus isolation in 88 patients, of whom 72 had previously received the Rubini vaccine strain. IgM antibodies to mumps virus were detected in 24/27 acute-phase serum samples. A group of 92 subjects from the same geographic area without signs of mumps virus infection served as controls; IgG antibodies to mumps virus and vaccination status were assessed in these children. The vaccination rate in these controls was 61%, with equal seropositivity rates for unvaccinated and Rubini-vaccinated subjects. These data support other recent reports indicating insufficient protective efficacy of current mumps vaccines.
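
As a back-of-the-envelope check on that conclusion, the screening method (a standard epidemiological formula, not something the abstract itself applies) can be run on the reported proportions. The sketch below is a rough illustration only, using the 72/88 vaccinated cases and the 61% control vaccination rate:

```python
# Screening-method estimate of vaccine effectiveness (standard formula, applied
# here for illustration; the study does not report this calculation):
#   VE = 1 - odds(vaccinated | cases) / odds(vaccinated | population)
pcv = 72 / 88   # proportion of confirmed mumps cases previously vaccinated
ppv = 0.61      # vaccination rate in the community controls

ve = 1 - (pcv / (1 - pcv)) / (ppv / (1 - ppv))
print(f"estimated vaccine effectiveness: {ve:.2f}")  # negative -> no measurable protection
```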

Relevance: 100.00%

Abstract:

BACKGROUND: The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS: We calculated the cost of the index hospitalization based on initial resource use, and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month, and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than for medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36,000 per quality-adjusted life-year, a result that was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS: PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
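
The headline figure can be reconstructed from the numbers in the abstract. A sketch of the arithmetic, assuming, as the authors state, that the 1-month utility gain declines linearly to zero over 3 years:

```python
# Reconstructing the reported ICER from the figures in the abstract.
initial_cost_diff = 9927 - 3900        # $6027 difference at the index hospitalization
cost_diff_12mo = 2883                  # difference after 1 year of follow-up costs

# Incremental 1-month utility gain, projected as a triangle declining to zero
# over 3 years: QALYs gained = 1/2 * height * base.
utility_gain = 0.054 - 0.001
qalys_gained = 0.5 * utility_gain * 3.0

icer = cost_diff_12mo / qalys_gained
print(f"ICER: ${icer:,.0f} per QALY")  # ~$36,264, matching the reported ~$36,000
```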

Relevance: 100.00%

Abstract:

A search for squarks and gluinos in final states containing jets, missing transverse momentum and no high-p_T electrons or muons is presented. The data represent the complete sample recorded in 2011 by the ATLAS experiment in 7 TeV proton-proton collisions at the Large Hadron Collider, with a total integrated luminosity of 4.7 fb⁻¹. No excess above the Standard Model background expectation is observed. Gluino masses below 860 GeV and squark masses below 1320 GeV are excluded at the 95% confidence level in simplified models containing only squarks of the first two generations, a gluino octet and a massless neutralino, for squark or gluino masses below 2 TeV, respectively. Squarks and gluinos with equal masses below 1410 GeV are excluded. In minimal supergravity/constrained minimal supersymmetric Standard Model models with tan β = 10, A₀ = 0 and μ > 0, squarks and gluinos of equal mass are excluded for masses below 1360 GeV. Constraints are also placed on the parameter space of supersymmetric models with compressed spectra. These limits considerably extend the region of supersymmetric parameter space excluded by previous measurements with the ATLAS detector.

Relevance: 100.00%

Abstract:

An updated search is performed for gluino, top squark, or bottom squark R-hadrons that have come to rest within the ATLAS calorimeter and decay at some later time to hadronic jets and a neutralino, using 5.0 and 22.9 fb⁻¹ of pp collisions at 7 and 8 TeV, respectively. Candidate decay events are triggered in selected empty bunch crossings of the LHC in order to remove pp collision backgrounds. Selections based on jet shape and muon system activity are applied to discriminate signal events from cosmic ray and beam-halo muon backgrounds. In the absence of an excess of events, improved limits are set on gluino, stop, and sbottom masses for different decays, lifetimes, and neutralino masses. With a neutralino of mass 100 GeV, the analysis excludes gluinos with mass below 832 GeV (with an expected lower limit of 731 GeV), for a gluino lifetime between 10 μs and 1000 s in the generic R-hadron model with equal branching ratios for decays to q q̄ χ̃⁰ and g χ̃⁰. Under the same assumptions for the neutralino mass and squark lifetime, top squarks and bottom squarks in the Regge R-hadron model are excluded with masses below 379 and 344 GeV, respectively.

Relevance: 100.00%

Abstract:

Congenital anomalies have been a leading cause of infant mortality for the past twenty years in the United States. Few registry-based studies have investigated the mortality experience of infants with congenital anomalies. Therefore, a registry-based mortality study was conducted of 2776 infants from the Texas Birth Defects Registry who were born January 1, 1995 to December 31, 1997, with selected congenital anomalies. Infants were matched to linked birth-infant death files from the Texas Department of Health, Bureau of Vital Statistics. One-year Kaplan-Meier survival curves and mortality estimates were generated for each of the 23 anomalies by maternal race/ethnicity, infant sex, birth weight, gestational age, number of life-threatening anomalies, prenatal diagnosis, hospital of birth and other variables.

There were 523 deaths within the first year of life (mortality rate = 191.0 per 1,000 infants). Infants with gastroschisis, trisomy 21, and cleft lip ± palate had the highest first-year survival (92.91%, 92.32%, and 87.59%, respectively). Anomalies with the lowest survival were anencephaly (5.13%), trisomy 13 (7.41%), and trisomy 18 (10.29%).

Infants born to White, non-Hispanic women had the highest first-year survival (83.57%; 95% CI: 80.91, 85.88), followed by African-Americans (82.43%; 95% CI: 76.98, 86.70) and Hispanics (79.28%; 95% CI: 77.19, 81.21). Infants with birth weights ≥2500 grams and gestational ages ≥37 weeks also had the highest first-year survival. First-year mortality increased drastically as the number of life-threatening anomalies increased. Mortality was also higher for infants with anomalies that were prenatally diagnosed. Slight differences existed in survival based on the infant's place of delivery.

In logistic regression analysis, birth weight (<1500 grams: OR = 7.48; 95% CI: 5.42, 10.33; 1500–2499 grams: OR = 3.48; 95% CI: 2.74, 4.42), prenatal diagnosis (OR = 1.92; 95% CI: 1.43, 2.58) and number of life-threatening anomalies (≥3: OR = 22.45; 95% CI: 11.67, 43.18) were the strongest predictors of death within the first year of life for all infants with selected congenital anomalies. To achieve further reduction in the infant mortality rate in the United States, additional research is needed to identify ways to reduce mortality among infants with congenital anomalies.
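
For readers unfamiliar with how such odds ratios are produced, the sketch below fits a logistic regression on synthetic data with statsmodels and exponentiates the coefficients. The variable names and effect sizes are invented stand-ins, chosen only so the resulting ORs land near the reported values:

```python
# Minimal sketch: odds ratios as exponentiated logistic-regression coefficients.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2776  # sample size matching the study
df = pd.DataFrame({
    "vlbw": rng.integers(0, 2, n),         # birth weight < 1500 g (synthetic)
    "prenatal_dx": rng.integers(0, 2, n),  # prenatally diagnosed (synthetic)
})
# True coefficients chosen so exp(beta) is near the reported ORs (7.48 and 1.92).
logit = -2.0 + 2.0 * df["vlbw"] + 0.65 * df["prenatal_dx"]
df["death_1yr"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(df["death_1yr"], sm.add_constant(df[["vlbw", "prenatal_dx"]])).fit(disp=0)
print(np.exp(model.params))       # exponentiated coefficients = odds ratios
print(np.exp(model.conf_int()))   # 95% confidence intervals for the ORs
```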

Relevance: 100.00%

Abstract:

Territory or zone design processes entail partitioning a geographic space, organized as a set of basic areal units, into different regions or zones according to a specific set of criteria that depend on the application context. In most cases the aim is to create zones of approximately equal size with respect to one or several measure attributes (zones with equal numbers of inhabitants, the same average sales, etc.). However, some of the new applications that have emerged, particularly in the context of sustainable development policies, are aimed at defining zones of a predetermined, though not necessarily similar, size. In addition, the zones should be built around a given set of positions, seeds or generators. This type of partitioning has not been sufficiently researched; therefore there are no known approaches for automated zone delimitation. This thesis proposes a new method based on a discrete version of the Adaptive Additively Weighted Voronoi Diagram (AAWVD) that makes it possible to partition a 2D space into zones of specific sizes, taking both the position and the weight of each (seed) generator into account. The method consists of repeatedly solving a traditional additively weighted Voronoi diagram, updating the weight of each generator at every iteration. Distances are computed with a metric based on the shortest path, which guarantees that the resulting partition is formed by geographically connected zones. The proposed heuristic has been included in a prototype application, developed in a GIS environment, that allows automated zone delimitation according to the criteria described above. The management of the extensive farming system of three municipalities of Castilla-La Mancha (Spain) has been used as a case study to analyze the viability of the method. The tests carried out establish that the proposed method, adapted to the criteria of this application field, is valid for solving this type of partition problem. The algorithm is capable of handling a large number of vector areal units, generating solutions that converge in reasonable CPU time and comply with the imposed constraints. Although the complexity of the problem is greatly reduced when the generators' positions are fixed, in some cases these positions impose a spatial configuration that the algorithm is unable to solve, revealing one of the limitations of the method. It has been shown that the location of the generators can have a considerable impact on the final solution, so that, as Kalcsics et al. (2005) observed, an "inadequate" selection can hardly generate valid zones that comply with the established criteria.
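
The core of the method lends itself to a compact illustration. The sketch below is a minimal rendering of the adaptive additively weighted Voronoi idea on a regular grid: it uses plain Euclidean distance rather than the thesis's shortest-path metric (so zone connectivity is not guaranteed), and a simple sign-based weight update invented here for illustration.

```python
import numpy as np

def adaptive_awvd(shape, seeds, targets, step=0.5, iters=500):
    """Discrete additively weighted Voronoi with iteratively adapted weights."""
    targets = np.asarray(targets)
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    weights = np.zeros(len(seeds))
    labels = np.zeros(shape, dtype=int)
    for _ in range(iters):
        # each cell goes to the generator minimizing (distance - weight)
        cost = np.stack([np.hypot(ys - sy, xs - sx) - w
                         for (sy, sx), w in zip(seeds, weights)])
        labels = cost.argmin(axis=0)
        sizes = np.bincount(labels.ravel(), minlength=len(seeds))
        if np.array_equal(sizes, targets):
            break
        # grow zones that are too small, shrink those that are too big
        weights += step * np.sign(targets - sizes)
    return labels

labels = adaptive_awvd((60, 60),
                       seeds=[(10, 10), (30, 45), (50, 20)],
                       targets=[1800, 1000, 800])
print(np.bincount(labels.ravel()))  # zone sizes, close to the requested targets
```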

Relevance: 100.00%

Abstract:

This work analyses the contribution that mathematical methods for aggregating expert judgments can make to the assessment of seismic hazard. Two sites in the Iberian Peninsula have been considered: Mugardos (La Coruña) and Cofrentes (Valencia). They are subject to different tectonic regimes, and both accommodate high-value industrial plants. Their areas of concern, with radii of 320 km, do not overlap. A probabilistic approach has been applied to estimate the annual probability of exceedance of the horizontal peak ground acceleration, using the Monte Carlo method to propagate the uncertainty in the models and parameters, including the definition of each seismogenic source and its seismicity, to the final results. The calculations were run with a computer program, developed for this work, that follows the methodology proposed by the Senior Seismic Hazard Analysis Committee (1997) for the NRC. The first conclusion from the results is that ground-motion attenuation is the main source of uncertainty in the hazard estimates for both sites. Given the difficulty of completing the available historical data on this variable, the performance of four mathematical methods of aggregating expert judgments in estimating a site attenuation law has been studied. The input data were obtained from the isoseismal catalogue of the Spanish National Geographic Institute (IGN). The earthquakes used as seed variables were chosen to cover the available historical record and the observed magnitude values evenly. A separate panel of experts was assigned to each site, and four aggregation methods were applied to their judgments: equal weights, Cooke, Apostolakis-Mosleh and Morris. The four proposals were compared with the actual data to judge their performance and ease of application. The results show that Cooke's method was the most efficient and robust for both sites. This method, moreover, allows the reasoned identification of those experts who should not have been included in a panel.
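
A linear opinion pool makes the contrast between the aggregation schemes concrete. The sketch below pools expert quantiles with equal weights versus hypothetical performance-based weights of the kind Cooke's method assigns; pooling quantiles linearly is itself a simplification of pooling full distributions, and all numbers are invented.

```python
# Toy linear opinion pool: equal weights vs. performance-based weights.
import numpy as np

# Each row: one expert's (5%, 50%, 95%) quantiles for peak ground acceleration (g).
experts = np.array([
    [0.05, 0.12, 0.30],
    [0.02, 0.08, 0.20],
    [0.10, 0.25, 0.60],
])

equal_w = np.full(len(experts), 1 / len(experts))
cooke_w = np.array([0.7, 0.25, 0.05])   # hypothetical calibration-based weights

print("equal-weights pool:", equal_w @ experts)
print("performance pool:  ", cooke_w @ experts)
```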

Relevance: 100.00%

Abstract:

Agricultural water management needs to evolve in view of increased water scarcity, especially when farming and natural protected areas are closely linked. In the study site of Doñana (southern Spain), water is shared by rice producers and a World Heritage biodiversity ecosystem. Our aim is to contribute to defining adaptation strategies that may build resilience to increasing water scarcity and minimize water conflicts between agricultural and natural systems. The analytical framework links a participatory process with quantitative methods to prioritize the adaptation options. Bottom-up proposed adaptation measures are evaluated by a multi-criteria analysis (MCA) that includes both socioeconomic criteria and criteria covering the ecosystem services affected by the adaptation options. Criteria weights are estimated by three different methods (analytic hierarchy process, Likert scale and equal weights), which are then compared. Finally, scores from the MCA are input into an optimization model used to determine the land-use distribution that maximizes utility and land-use diversification under different scenarios of funds and water availability. While our results show a spectrum of perceived priorities among stakeholders, one overriding theme is to define a way to restore part of the rice fields to natural wetlands. These results hold true under the current climate scenario, and even more so under an increased water scarcity scenario.
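
A weighted-sum MCA of the kind described reduces to a few lines. The sketch below scores invented adaptation options under equal weights and under a hypothetical AHP-derived weight vector; none of the options, criteria or numbers come from the study.

```python
# Minimal weighted-sum multi-criteria scoring: normalize each criterion, then
# rank options under different weight vectors (equal weights vs. AHP weights).
import numpy as np

options = ["keep rice fields", "partial wetland restoration", "full restoration"]
# columns: farm income, water saved, habitat quality (raw scores, invented)
scores = np.array([
    [0.9, 0.2, 0.3],
    [0.6, 0.6, 0.7],
    [0.2, 0.9, 0.9],
])
norm = scores / scores.max(axis=0)      # simple max normalization per criterion

equal_w = np.array([1/3, 1/3, 1/3])
ahp_w = np.array([0.2, 0.4, 0.4])       # hypothetical AHP-derived weights

for name, w in [("equal weights", equal_w), ("AHP weights", ahp_w)]:
    total = norm @ w
    print(f"{name}: scores={np.round(total, 2)} -> best: {options[int(total.argmax())]}")
```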

Relevance: 100.00%

Abstract:

Is it profitable for an investor, from a risk-return perspective, to acquire a stake in a quoted company when a capital increase is announced? This paper analyses the return obtained from investing in equity issues with cash contribution and pre-emptive rights that are aimed at funding corporate activities: acquisitions, investments in new facilities and/or strengthening the balance sheet of the issuing companies. Over the 16 years covered by the study, the results show a negative average excess risk-adjusted return of almost 5% from the moment the equity offer is announced until the completion of the preferential subscription period. To obtain this excess return, the difference between the nominal internal rate of return (IRR) and the expected return, estimated using the CAPM, is computed for each equity issue. The intention behind this method is to eliminate the effects of time and any other possible effect on the stock price during the period of the analysis. The results of this article are also consistent with the pecking order theory for the Spanish stock market six months after the preferential subscription period, although there is a positive return after three months.
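
The risk adjustment reduces to a one-line formula per issue: excess return equals the nominal IRR minus the CAPM-expected return over the same window. A minimal sketch with illustrative inputs, chosen so the result lands near the paper's roughly -5% average rather than taken from its data:

```python
def capm_expected_return(rf: float, beta: float, market_return: float) -> float:
    """CAPM: E[r] = rf + beta * (market return - rf)."""
    return rf + beta * (market_return - rf)

def excess_return(nominal_irr: float, rf: float, beta: float, market_return: float) -> float:
    """Risk-adjusted excess return of one equity issue over the analysis window."""
    return nominal_irr - capm_expected_return(rf, beta, market_return)

# Illustrative inputs only: an issue returning 2% while CAPM predicts ~6.5%.
print(f"{excess_return(nominal_irr=0.02, rf=0.03, beta=1.1, market_return=0.062):+.2%}")
# -> -4.52%, of the same order as the ~-5% average the paper reports
```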

Relevance: 100.00%

Abstract:

Purpose – The objective of the present research is to examine the relationship between consumers' satisfaction with a retailer and the equity they associate with the retail brand.

Design/methodology/approach – Retail brand equity is conceptualized as a four-dimensional construct comprising retailer awareness, retailer associations, retailer perceived quality, and retailer loyalty. The associative network memory model from cognitive psychology is then applied to the specific context of the relationships between customer satisfaction and consumer-based retailer equity. A survey was undertaken using a convenience sample of shopping mall consumers in an Australian state capital city. The questionnaire used to collect data included an experimental design such that two categories of retailers were included in the study, department stores and specialty stores, with three retailers representing each category. The relationship between consumer-based retailer equity and customer satisfaction was examined using multivariate analysis of variance.

Findings – Results indicate that retail brand equity varies with customer satisfaction. For department stores, each consumer-based retailer equity dimension varied according to customer satisfaction with the retailer. However, for specialty stores, only three of the consumer-based retailer equity dimensions, namely retailer awareness, retailer associations and retailer perceived quality, varied according to the level of customer satisfaction with the retailer.

Originality/value – The principal contribution of the present research is that it demonstrates empirically a positive relationship between customer satisfaction and an intangible asset such as retailer equity.
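
As a sketch of the analysis design, the snippet below runs a MANOVA in statsmodels with four equity dimensions as dependent variables and a satisfaction group as the factor. The data and variable names are synthetic stand-ins, not the study's survey items.

```python
# Minimal MANOVA sketch: four dependent variables, one grouping factor.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 300
satisfaction = rng.integers(1, 4, n)   # 1 = low, 2 = medium, 3 = high (synthetic)
df = pd.DataFrame({
    "satisfaction": satisfaction.astype(str),
    "awareness": satisfaction + rng.normal(0, 1, n),
    "assoc":     satisfaction + rng.normal(0, 1, n),
    "quality":   satisfaction + rng.normal(0, 1, n),
    "loyalty":   satisfaction + rng.normal(0, 1, n),
})

mv = MANOVA.from_formula("awareness + assoc + quality + loyalty ~ satisfaction", data=df)
print(mv.mv_test())   # Wilks' lambda etc. for the satisfaction effect
```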

Relevance: 100.00%

Abstract:

This research extends the consumer-based brand equity measurement approach to the measurement of the equity associated with retailers. The paper also addresses some of the limitations of current retailer equity measurement, such as a lack of clarity regarding its nature and dimensionality. We conceptualise retailer equity as a four-dimensional construct comprising retailer awareness, retailer associations, perceived retailer quality, and retailer loyalty. The paper reports the results of an empirical study of a convenience sample of 601 shopping mall consumers in an Australian state capital city. Following a confirmatory factor analysis using structural equation modelling to examine the dimensionality of the retailer equity construct, the proposed model is tested for two retailer categories, department stores and speciality stores. Results confirm the hypothesised four-dimensional structure.
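
The hypothesized measurement model translates directly into lavaan-style syntax. Below is a sketch of such a CFA using the semopy package on synthetic data; the indicator names are placeholders, since the actual survey items are not given here.

```python
# Sketch of a four-factor CFA specification (semopy, lavaan-style syntax).
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(2)
n = 601                                    # sample size matching the study
latent = rng.normal(size=(n, 4))
data = pd.DataFrame({
    f"{name}{j}": latent[:, i] + rng.normal(0, 0.5, n)
    for i, name in enumerate(["aware", "assoc", "qual", "loyal"])
    for j in (1, 2, 3)
})

desc = """
awareness    =~ aware1 + aware2 + aware3
associations =~ assoc1 + assoc2 + assoc3
quality      =~ qual1 + qual2 + qual3
loyalty      =~ loyal1 + loyal2 + loyal3
"""
model = semopy.Model(desc)
model.fit(data)
print(semopy.calc_stats(model))            # fit indices for the four-factor model
```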

Relevance: 100.00%

Abstract:

Down's syndrome, first described by J. Langdon Down in 1866, is the most common chromosomal abnormality in the human population. Its incidence is approximately 1 in 650 births, although the risk of having a child with Down's syndrome increases markedly with the age of the mother. It occurs with equal frequency in all racial groups. The risk to a mother 16-26 years old is 1 in 1,300, but the risk increases to 1 in 30 for a mother 45-47 years old. The life expectancy of people with Down's syndrome has risen since the 1920s, and many individuals now live into their fifth decade or beyond. Consequently, optometrists are increasingly likely to see Down's patients of all ages in practice.

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: Primary 47A48, Secondary 60G12.