914 results for Model transformation analysis
Abstract:
To cut costs, many companies outsource services that are not part of their core competencies to external service providers; this process is known as outsourcing. The resulting dependencies on the external providers are contractually regulated by means of Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Because the market has produced a multitude of different SLM tools, practical problems arise from proprietary SLA formats and missing specification methods. The result is tool dependence and limited reusability of SLAs that have already been specified. This thesis develops an approach to platform-independent Service Level Management. The goal is to unify modelling so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. Platform independence also gives the resulting models high temporal stability. Further goals of the thesis are to guarantee the reusability of modelled SLAs and to provide a process-oriented modelling methodology. Automated establishment of modelled SLAs is of decisive importance for practical use. To reach these goals, the principles of Model Driven Architecture (MDA) are applied to the problem domain of Service Level Management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of Service Level Agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of the MDA.
A suitable model transformation generates an SLA instance from an SLA pattern; the instance contains all the necessary configuration information and is already in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of the MDA. Establishing the SLA instances and the resulting configuration of the management system corresponds to the Platform Specific Code (PSC) of the MDA. After this step, the management system is able to autonomously monitor the quality-of-service parameters agreed in the SLA. As part of this work, a UML extension was defined that enables SLA patterns to be modelled with a UML tool; modelling can be purely graphical or can additionally use the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated in a case study.
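The MDA pipeline described above (configuration-independent SLA pattern → configured, platform-specific SLA instance) can be sketched in a few lines. Everything here is hypothetical — the class names, fields, and target format are illustrative, not taken from the thesis:

```python
# Illustrative sketch of an MDA-style model transformation for SLAs:
# an SLAPattern plays the role of the PIM, the transformed dict the PSM.
# All names and the target "format" are hypothetical.
from dataclasses import dataclass

@dataclass
class SLAPattern:
    service: str                # managed service the pattern applies to
    metric: str                 # e.g. "availability"
    threshold_placeholder: str  # configuration slot, bound at transform time

def transform(pattern: SLAPattern, config: dict) -> dict:
    """Model transformation: bind configuration values into the pattern
    and emit an SLA instance in a (hypothetical) target-platform format."""
    return {
        "service": pattern.service,
        "metric": pattern.metric,
        "threshold": config[pattern.threshold_placeholder],
        "format": config.get("platform", "generic"),
    }

pattern = SLAPattern("mail", "availability", "min_availability")
instance = transform(pattern, {"min_availability": 0.999, "platform": "XML-SLM"})
print(instance["threshold"])  # 0.999
```

The same pattern can be reused against other platforms simply by passing a different configuration, which is the reusability argument the abstract makes.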
Abstract:
This work applies the Structural Analysis of Strategic Sectors (AESE) model and identifies alternative development scenarios for companies in the textile and apparel sector of Bogotá. It describes the current state of these companies in microeconomic and macroeconomic terms, and then identifies the sector's current situation, achievements, scope, and traceability. The research was carried out in two stages. The first corresponds to the AESE and comprises four phases: in the first, the companies' financial statements and strategic variables are evaluated to determine the degree of strategic convergence within the sector; in the second, the competitive landscape is surveyed to identify unserved market spaces that represent growth opportunities; in the third, a diagnosis of market forces is performed; and in the last phase, a study of competitors is carried out. The second stage of the research is the prospective analysis, comprising three steps: first, the sector's key variables and their levels of internal and external impact were extracted from the AESE; second, the influence exerted by the actors present in the sector is determined; and finally, through a managerial vision covering social, political, economic, and technological aspects, future scenarios are projected, each with a probability of occurrence.
Abstract:
This article belongs to a monographic section of the journal devoted to educating the gaze: proposals for teaching how to watch TV. - Abstract taken partially from the journal.
Abstract:
We hypothesized that although large populations may appear able to withstand predation and disturbance, added stochasticity in population growth rate (λ) increases the risk of dramatic population declines. Approximately half of the Aleutian Islands' population of Least Auklets (Aethia pusilla) breed at one large colony at Kiska Island in the presence of introduced Norway rats (Rattus norvegicus) whose population erupts periodically. We evaluated two management plans, do nothing or eradicate rats, for this colony, and performed stochastic elasticity analysis to focus future research and management. Our results indicated that Least Auklets breeding at Kiska Island had the lowest absolute value of growth rate and more variable λ's (neither statistically significant) during 2001-2010, when compared with rat-free colonies at Buldir and Kasatochi islands. We found variability in the annual proportional change in population size among islands with Kiska Island having the fastest rate of decline, 78% over 20 years. Under the assumption that the eradication of rats would result in vital rates similar to those observed at rat-free Buldir and Kasatochi islands, we found the projected population decline decreased from 78% to 24% over 20 years. Overall, eradicating rats at Kiska Island is not likely to increase Least Auklet vital rates, but will decrease the amount of variation in λ, resulting in a significantly slower rate of population decline. We recommend the eradication of rats from Kiska Island to decrease the probability of dramatic population declines and ensure the future persistence of this important colony.
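The abstract's central point, that added stochasticity in the growth rate λ raises the risk of dramatic declines even when mean growth is similar, can be illustrated with a toy stochastic projection. The numbers below are illustrative only, not the study's vital rates:

```python
# Toy sketch (not the authors' model): project population size under a
# stochastic annual growth rate lambda and count steep declines.
# Mean lambda is held fixed; only the variance differs between scenarios.
import random

def project(n0: float, mean_lambda: float, sd_lambda: float,
            years: int = 20, reps: int = 2000, seed: int = 1) -> float:
    """Return the fraction of replicates declining by more than 50%."""
    rng = random.Random(seed)
    steep = 0
    for _ in range(reps):
        n = n0
        for _ in range(years):
            n *= max(rng.gauss(mean_lambda, sd_lambda), 0.0)
        if n < 0.5 * n0:
            steep += 1
    return steep / reps

low_var = project(1000, 0.97, 0.02)
high_var = project(1000, 0.97, 0.15)
print(low_var, high_var)  # higher variance -> more frequent steep declines
```

With the same mean λ, the high-variance scenario produces steep (>50%) declines far more often, which is the mechanism behind the recommendation to reduce variation in λ by eradicating rats.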
Abstract:
Understanding the effect of habitat fragmentation is a fundamental yet complicated aim of many ecological studies. Beni savanna is a naturally fragmented forest habitat, where forest islands exhibit variation in resources and threats. To understand how the availability of resources and threats affect the use of forest islands by parrots, we applied occupancy modeling to quantify use and detection probabilities for 12 parrot species on 60 forest islands. The presence of urucuri (Attalea phalerata) and macaw (Acrocomia aculeata) palms, the number of tree cavities on the islands, and the presence of selective logging and fire were included as covariates associated with availability of resources and threats. The model-selection analysis indicated that both resource and threat variables explained the use of forest islands by parrots. For most species, the best models confirmed predictions. The number of cavities was positively associated with use of forest islands by 11 species. The area of the island and the presence of macaw palm showed a positive association with the probability of use by seven and five species, respectively, while selective logging and fire showed a negative association with five and six species, respectively. The Blue-throated Macaw (Ara glaucogularis), the critically endangered parrot species endemic to our study area, was the only species that showed a negative association with both threats. Monitoring continues to be essential to evaluate conservation and management actions of parrot populations. Understanding how species use this naturally fragmented habitat will help determine which fragments should be preserved and which conservation actions are needed.
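A hedged sketch of the idea behind the occupancy modeling mentioned above: jointly estimating the probability of use (ψ) and the detection probability (p) from repeat-visit detection histories, so that sites never detected are not automatically treated as unused. The histories and the brute-force grid search below are invented for illustration, not the study's data or software:

```python
# Single-season occupancy sketch: each tuple is one site's detection
# history over 3 visits (1 = species detected). An all-zero history can
# mean either "unused" or "used but never detected".
import math
from itertools import product

histories = [(1, 0, 1), (0, 0, 0), (1, 1, 0), (0, 0, 0), (0, 1, 0)]

def log_lik(psi: float, p: float) -> float:
    ll = 0.0
    for h in histories:
        k, n = sum(h), len(h)
        if k > 0:   # used and detected at least once
            ll += math.log(psi * p**k * (1 - p)**(n - k))
        else:       # never detected: unused, or used but always missed
            ll += math.log((1 - psi) + psi * (1 - p)**n)
    return ll

# Brute-force maximum likelihood over a 0.01 grid (toy-sized problem).
grid = [i / 100 for i in range(1, 100)]
psi_hat, p_hat = max(product(grid, grid), key=lambda t: log_lik(*t))
print(psi_hat, p_hat)
```

Note that the estimated ψ exceeds the naive proportion of sites with detections (3/5), because imperfect detection means some all-zero sites were plausibly used.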
Abstract:
A new method for assessing forecast skill and predictability that involves the identification and tracking of extratropical cyclones has been developed and implemented to obtain detailed information about the prediction of cyclones that cannot be obtained from more conventional analysis methodologies. The cyclones were identified and tracked along the forecast trajectories, and statistics were generated to determine the rate at which the position and intensity of the forecasted storms diverge from the analyzed tracks as a function of forecast lead time. The results show a higher level of skill in predicting the position of extratropical cyclones than the intensity. They also show that there is potential to improve the skill in predicting the position by 1–1.5 days and the intensity by 2–3 days, via improvements to the forecast model. Further analysis shows that forecasted storms move at a slower speed than analyzed storms on average and that there is a larger error in the predicted amplitudes of intense storms than of weaker storms. The results also show that some storms can be predicted up to 3 days before they are identified as an 850-hPa vorticity center in the analyses. In general, the results show a higher level of skill in the Northern Hemisphere (NH) than the Southern Hemisphere (SH); however, the rapid growth of NH winter storms is not very well predicted. The impact that observations of different types have on the prediction of the extratropical cyclones has also been explored, using forecasts integrated from analyses that were constructed from reduced observing systems. Terrestrial, satellite-based, and surface-based systems were investigated, and the results showed that the predictive skill of the terrestrial system was superior to the satellite system in the NH. Further analysis showed that the satellite system was not very good at predicting the growth of the storms.
In the SH the terrestrial system had significantly less skill than the satellite system, highlighting the dominance of satellite observations in this hemisphere. The surface system had very poor predictive skill in both hemispheres.
Abstract:
Climate models provide compelling evidence that if greenhouse gas emissions continue at present rates, then key global temperature thresholds (such as the European Union limit of two degrees of warming since pre-industrial times) are very likely to be crossed in the next few decades. However, there is relatively little attention paid to whether, should a dangerous temperature level be exceeded, it is feasible for the global temperature to then return to safer levels in a usefully short time. We focus on the timescales needed to reduce atmospheric greenhouse gases and associated temperatures back below potentially dangerous thresholds, using a state-of-the-art general circulation model. This analysis is extended with a simple climate model to provide uncertainty bounds. We find that even for very large reductions in emissions, temperature reduction is likely to occur at a low rate. Policy-makers need to consider such very long recovery timescales implicit in the Earth system when formulating future emission pathways that have the potential to 'overshoot' particular atmospheric concentrations of greenhouse gases and, more importantly, related temperature levels that might be considered dangerous.
Abstract:
A significant desert dust deposition event occurred on Mt. Elbrus, Caucasus Mountains, Russia on 5 May 2009, where the deposited dust later appeared as a brown layer in the snow pack. An examination of dust transportation history and analysis of chemical and physical properties of the deposited dust were used to develop a new approach for high-resolution provenancing of dust deposition events recorded in snow pack using multiple independent techniques. A combination of SEVIRI red-green-blue composite imagery, MODIS atmospheric optical depth fields derived using the Deep Blue algorithm, air mass trajectories derived with HYSPLIT model and analysis of meteorological data enabled identification of dust source regions with high temporal (hours) and spatial (ca. 100 km) resolution. Dust, deposited on 5 May 2009, originated in the foothills of the Djebel Akhdar in eastern Libya where dust sources were activated by the intrusion of cold air from the Mediterranean Sea and Saharan low pressure system and transported to the Caucasus along the eastern Mediterranean coast, Syria and Turkey. Particles with an average diameter below 8 μm accounted for 90% of the measured particles in the sample with a mean of 3.58 μm, median 2.48 μm and the dominant mode of 0.60 μm. The chemical signature of this long-travelled dust was significantly different from the locally-produced dust and close to that of soils collected in a palaeolake in the source region, in concentrations of hematite and oxides of aluminium, manganese, and magnesium. Potential addition of dust from a secondary source in northern Mesopotamia introduced uncertainty in the provenancing of dust from this event. Nevertheless, the approach adopted here enables other dust horizons in the snowpack to be linked to specific dust transport events recorded in remote sensing and meteorological data archives.
Abstract:
A record of dust deposition events between 2009 and 2012 on Mt. Elbrus, Caucasus Mountains, derived from a snow pit and a shallow ice core is presented for the first time for this region. A combination of isotopic analysis, SEVIRI red-green-blue composite imagery, MODIS atmospheric optical depth fields derived using the Deep Blue algorithm, air mass trajectories derived using the HYSPLIT model and analysis of meteorological data enabled identification of dust source regions with high temporal (hours) and spatial (ca. 20–100 km) resolution. Seventeen dust deposition events were detected; fourteen occurred in March–June, one in February and two in October. Four events originated in the Sahara, predominantly in north-eastern Libya and eastern Algeria. Thirteen events originated in the Middle East, in the Syrian Desert and northern Mesopotamia, from a mixture of natural and anthropogenic sources. Dust transportation from the Sahara was associated with vigorous Saharan depressions, strong surface winds in the source region and mid-tropospheric south-westerly flow with daily wind speeds of 20–30 m s−1 at the 700 hPa level; although these events were less frequent, they resulted in higher dust concentrations in snow. Dust transportation from the Middle East was associated with weaker depressions forming over the source region, high pressure centered over or extending towards the Caspian Sea and a weaker southerly or south-easterly flow towards the Caucasus Mountains with daily wind speeds of 12–18 m s−1 at the 700 hPa level. Higher concentrations of nitrates and ammonium characterise dust from the Middle East deposited on Mt. Elbrus in 2009, indicating a contribution from anthropogenic sources. The modal values of particle size distributions ranged between 1.98 μm and 4.16 μm. Most samples were characterised by modal values of 2.0–2.8 μm with an average of 2.6 μm, and there was no significant difference between dust from the Sahara and the Middle East.
Abstract:
The slow advective-timescale dynamics of the atmosphere and oceans is referred to as balanced dynamics. An extensive body of theory for disturbances to basic flows exists for the quasi-geostrophic (QG) model of balanced dynamics, based on wave-activity invariants and nonlinear stability theorems associated with exact symmetry-based conservation laws. In attempting to extend this theory to the semi-geostrophic (SG) model of balanced dynamics, Kushner & Shepherd discovered lateral boundary contributions to the SG wave-activity invariants which are not present in the QG theory, and which affect the stability theorems. However, because of technical difficulties associated with the SG model, the analysis of Kushner & Shepherd was not fully nonlinear. This paper examines the issue of lateral boundary contributions to wave-activity invariants for balanced dynamics in the context of Salmon's nearly geostrophic model of rotating shallow-water flow. Salmon's model has certain similarities with the SG model, but also has important differences that allow the present analysis to be carried to finite amplitude. In the process, the way in which constraints produce boundary contributions to wave-activity invariants, and additional conditions in the associated stability theorems, is clarified. It is shown that Salmon's model possesses two kinds of stability theorems: an analogue of Ripa's small-amplitude stability theorem for shallow-water flow, and a finite-amplitude analogue of Kushner & Shepherd's SG stability theorem in which the ‘subsonic’ condition of Ripa's theorem is replaced by a condition that the flow be cyclonic along lateral boundaries. As with the SG theorem, this last condition has a simple physical interpretation involving the coastal Kelvin waves that exist in both models. Salmon's model has recently emerged as an important prototype for constrained Hamiltonian balanced models. 
The extent to which the present analysis applies to this general class of models is discussed.
Abstract:
Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
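The key comparison above, whether more of the ensemble spread comes from crop models or from the downscaled general circulation models, can be sketched with an ANOVA-style variance decomposition. The numbers below are made up purely to illustrate the calculation:

```python
# Illustrative variance partitioning (made-up numbers, not the study's):
# impacts[c][g] is the simulated yield change (%) for crop model c
# driven by downscaled GCM g.
from statistics import mean

impacts = [
    [-8.0, -6.5, -7.2],
    [-3.1, -2.4, -2.9],
    [-12.0, -10.8, -11.5],
]

grand = mean(v for row in impacts for v in row)
crop_means = [mean(row) for row in impacts]        # average over GCMs
gcm_means = [mean(col) for col in zip(*impacts)]   # average over crop models

var_crop = mean((m - grand) ** 2 for m in crop_means)  # between-crop-model
var_gcm = mean((m - grand) ** 2 for m in gcm_means)    # between-GCM
print(var_crop > var_gcm)  # True: crop-model spread dominates in this toy case
```

In this toy ensemble the between-crop-model variance dwarfs the between-GCM variance, mirroring the qualitative finding reported in the abstract.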
Abstract:
For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
Abstract:
BACKGROUND: Low vitamin D status has been shown to be a risk factor for several metabolic traits such as obesity, diabetes and cardiovascular disease. The biological actions of 1,25-dihydroxyvitamin D are mediated through the vitamin D receptor (VDR), which heterodimerizes with retinoid X receptor, gamma (RXRG). Hence, we examined the potential interactions between the tagging polymorphisms in the VDR (22 tag SNPs) and RXRG (23 tag SNPs) genes on metabolic outcomes such as body mass index, waist circumference, waist-hip ratio (WHR), high- and low-density lipoprotein (LDL) cholesterols, serum triglycerides, systolic and diastolic blood pressures and glycated haemoglobin in the 1958 British Birth Cohort (1958BC, up to n = 5,231). We used the Multifactor-Dimensionality Reduction (MDR) program as a non-parametric test to examine for potential interactions between the VDR and RXRG gene polymorphisms in the 1958BC. We used the data from the Northern Finland Birth Cohort 1966 (NFBC66, up to n = 5,316) and Twins UK (up to n = 3,943) to replicate our initial findings from the 1958BC. RESULTS: After Bonferroni correction, the joint-likelihood ratio test suggested interactions on serum triglycerides (4 SNP-SNP pairs), LDL cholesterol (2 SNP-SNP pairs) and WHR (1 SNP-SNP pair) in the 1958BC. MDR permutation model testing analysis showed one two-way and one three-way interaction to be statistically significant on serum triglycerides in the 1958BC. In meta-analysis of results from two replication cohorts (NFBC66 and Twins UK, total n = 8,183), none of the interactions remained after correction for multiple testing (P_interaction > 0.17). CONCLUSIONS: Our results did not provide strong evidence for interactions between allelic variations in VDR and RXRG genes on metabolic outcomes; however, further replication studies on large samples are needed to confirm our findings.
Abstract:
We propose a new class of neurofuzzy construction algorithms with the aim of maximizing generalization capability, specifically for imbalanced data classification problems, based on leave-one-out (LOO) cross validation. The algorithms have two stages: first, an initial rule base is constructed based on estimating a Gaussian mixture model with analysis-of-variance decomposition from the input data; the second stage carries out joint weighted least-squares parameter estimation and rule selection using an orthogonal forward subspace selection (OFSS) procedure. We show how different LOO-based rule selection criteria can be incorporated with OFSS, and advocate either maximizing the leave-one-out area under the curve of the receiver operating characteristics, or maximizing the leave-one-out F-measure if the data sets exhibit imbalanced class distribution. Extensive comparative simulations illustrate the effectiveness of the proposed algorithms.
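One of the selection criteria named above, the area under the ROC curve, can be computed directly in its Mann-Whitney (rank) form. This is a generic sketch of the metric, not the authors' LOO-based implementation:

```python
# AUC in Mann-Whitney form: the probability that a randomly chosen
# positive example is scored above a randomly chosen negative one,
# with ties counting half. Scores and labels below are illustrative.
def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.4, 0.3], [1, 1, 0, 0]))  # 1.0 (perfect ranking)
print(auc([0.9, 0.3, 0.8, 0.4], [1, 1, 0, 0]))  # 0.5 (chance-level ranking)
```

Because AUC depends only on the ranking of scores, not on a decision threshold, it is a natural criterion when class distributions are imbalanced, which is the setting the abstract targets.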