Abstract:
Context: The core idea is the creation of a company for the manufacture and sale of high-end cosmetic products. Despite the economic crisis of recent years, which we do not yet seem to have left behind, the cosmetics sector has withstood the situation remarkably well. As economists say, "the crisis will pass and recovery will come; everything is cyclical", and given the strength this sector is showing, together with the need for our economy to move away from badly deteriorated sectors with no positive outlook, cosmetics can expect a promising future within the Spanish and European economic context. "Vinci Cosmetics" is the name initially chosen for the new company to begin its journey. It is a name with Latin overtones from the distant era of the Roman Empire, a tribute to the rich historical heritage of the author's home city, Tarragona. The culture of self-care and well-being, deeply rooted in the Latin and Mediterranean mentality, is a key cultural factor for the development and future evolution of this industry. The company slogan, "A Mediterranean Diet for your skin", largely clarifies the idea to be developed and the objectives pursued by the business strategy. Objectives: The essential objective of this final degree project (TFC) is to produce a detailed study for the creation of a company manufacturing and selling high-end cosmetic products, covering the company's management and process-based governance, the marketing function, the operations and processes to be carried out, the management of the human factor, and the budget needed to fund this ambitious project. In short, a complete business plan setting out the guidelines of the industrial organisation to be created, itself composed of five plans: a management plan, a marketing plan, an operations plan, a human resources plan, and an economic-financial plan.
Procedures: The project has a strong market-research and marketing component, but it also aims to cover management, operational processes, the human factor, and the economic and financial side of the required investment and budget. The initial study will focus on an analysis of the Iberian Peninsula market; depending on the company's expected progress, it could extend its scope to Europe, although not before the medium term. "Vinci Cosmetics" has adopted integrated management system standards across the whole organisation: principally, on the one hand, the international standard ISO 9001:2008 and, on the other, the EFQM Excellence Model, both implemented through a process-based approach that ensures continuous control and excellent management. Conclusions: After evaluating the current state of the cosmetics market, its future outlook, and the needs of an industrial organisation, a high-end cosmetics company can be created, with full guarantees of success as a business organisation and from an economic standpoint, to meet the needs of a market segment that demands it.
Abstract:
The modern company finds itself in a situation of uncertainty in which it is also more exposed to external factors, given increased competition and the impact of globalisation. It faces a landscape that must be managed properly to ensure its survival and success. This final degree project (TFC) presents one of the flagship tools of modern management, the one destined to be the foundation of company governance and increasingly indispensable for building an ever more uncertain future: Strategic Planning. The 2012-2015 Strategic Plan for AESL is developed in several sections that seek the keys to the direction the company should follow and propose the objectives it will reach through the implementation of the chosen actions and strategies. The first stage, and the most extensive one, carries out an Analysis of the company's external and internal situation, together with a Diagnosis of the company; it examines the various factors that may support or endanger the company's future and constitutes the starting point of the planning process. In the second stage, using the information from the previous phase, the Corporate Objectives are set and the Strategic Decisions that must turn the current course into the achievement of those objectives are defined. The strategies will not only operate at the corporate level; they will also include functional strategies with a more defining character inside the company. Finally, the third and last stage brings all these Strategic Decisions together and turns them into Operational Decisions, those in charge of managing the day-to-day running of the company through the Action Plans, which constitute the most concrete stage of the Plan.
This three-year Plan should establish itself as the company's most important planning tool, guiding the main decisions taken and becoming a reference instrument, not only for deciding today what will be done in the future, but also for maintaining high levels of competitiveness in the enormous effort that managing an SME entails.
Abstract:
The competitiveness of businesses is increasingly dependent on their electronic networks with customers, suppliers, and partners. While the strategic and operational impact of external integration and IOS adoption has been extensively studied, much less attention has been paid to the organizational and technical design of electronic relationships. The objective of our longitudinal research project is the development of a framework for understanding and explaining B2B integration. Drawing on existing literature and empirical cases we present a reference model (a classification scheme for B2B integration). The reference model comprises technical, organizational, and institutional levels to reflect the multiple facets of B2B integration. In this paper we investigate the current state of electronic collaboration in global supply chains, focusing on the technical view. Using an in-depth case analysis we identify five integration scenarios. In the subsequent confirmatory phase of the research we analyse 112 real-world company cases to validate these five integration scenarios. Our research advances and deepens existing studies by developing a B2B reference model, which reflects the current state of practice and is independent of specific implementation technologies. In the next stage of the research the emerging reference model will be extended to create an assessment model for analysing the maturity level of a given company in a specific supply chain.
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison is performed for intense events that caused severe damage to people and property. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for giving a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift of the forecast error and to identify the error sources that affected each model forecast. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
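The non-parametric skill scores mentioned above are derived from a 2x2 contingency table of forecast versus observed exceedances of a precipitation threshold. A minimal sketch of three common scores follows; the function names and the threshold convention are illustrative assumptions, not the paper's exact definitions.

```python
# Hedged sketch: contingency-table skill scores commonly used in QPF
# verification. Inputs are paired forecast/observed precipitation amounts.

def contingency_table(forecast, observed, threshold):
    """Count hits, misses, false alarms and correct negatives for a
    given precipitation threshold (same units as the fields)."""
    hits = misses = false_alarms = correct_neg = 0
    for f, o in zip(forecast, observed):
        fc, ob = f >= threshold, o >= threshold
        if fc and ob:
            hits += 1
        elif not fc and ob:
            misses += 1
        elif fc and not ob:
            false_alarms += 1
        else:
            correct_neg += 1
    return hits, misses, false_alarms, correct_neg

def skill_scores(hits, misses, false_alarms, correct_neg):
    """Probability of detection (POD), false alarm ratio (FAR) and
    critical success index (CSI)."""
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    return pod, far, csi
```

The misses and false-alarm counts per region, as used in the paper to map affected areas of Catalonia, come directly from the second and third elements of the table.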
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain-rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. Thereafter, a comparison between different techniques and datasets to retrieve the vertical profile of the refractive index gradient in the boundary layer is also presented; in particular, their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset to reproduce accurately the local propagation conditions, while lower-resolution standard TEMP data suffer from interpolation degradation and Numerical Weather Prediction model data (Lokal Model) can capture a tendency towards superrefraction but cannot detect ducting conditions. From ray tracing of the centre and of the lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
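The link between the refractive-index gradient and the beam trajectory can be illustrated with the standard effective-earth-radius approximation, which is much simpler than the multilayer ray-tracing model used in the study; it assumes a single constant gradient expressed by the factor k_e (4/3 under standard conditions). All names and values below are illustrative assumptions.

```python
import math

# Hedged sketch (not the study's multilayer ray-tracing model): beam-centre
# height under the effective-earth-radius approximation. A larger k_e mimics
# superrefraction and lowers the beam relative to standard propagation.

EARTH_RADIUS_M = 6_371_000.0

def beam_height(range_m, elev_deg, k_e=4.0 / 3.0, antenna_height_m=0.0):
    """Height (m) of the beam centre above the antenna level at a given
    slant range, assuming a constant refractive-index gradient (factor k_e)."""
    re = k_e * EARTH_RADIUS_M          # effective earth radius
    theta = math.radians(elev_deg)
    return (math.sqrt(range_m ** 2 + re ** 2 + 2.0 * range_m * re * math.sin(theta))
            - re + antenna_height_m)
```

Evaluating the same formula for the lower and upper edges of the 3-dB beam (elevation plus or minus half the beamwidth) shows how a change in propagation conditions shifts the whole measured volume, which is the source of the reflectivity error discussed above.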
Abstract:
Weather radar observations are currently the most reliable method for remote sensing of precipitation. However, a number of factors affect the quality of radar observations and may seriously limit automated quantitative applications of radar precipitation estimates, such as those required in Numerical Weather Prediction (NWP) data assimilation or in hydrological models. In this paper, a technique to correct two different problems typically present in radar data is presented and evaluated. The aspects dealt with are non-precipitating echoes, caused either by permanent ground clutter or by anomalous propagation of the radar beam (anaprop echoes), and also topographical beam blockage. The correction technique is based on the computation of realistic beam propagation trajectories from recent radiosonde observations instead of assuming standard radio propagation conditions. The correction consists of three different steps: 1) calculation of a Dynamic Elevation Map, which provides the minimum clutter-free antenna elevation for each pixel within the radar coverage; 2) correction for residual anaprop, checking the vertical reflectivity gradients within the radar volume; and 3) topographical beam blockage estimation and correction using a geometric optics approach. The technique is evaluated with four case studies in the region of the Po Valley (N Italy) using a C-band Doppler radar and a network of raingauges providing hourly precipitation measurements. The case studies cover different seasons, different radio propagation conditions, and both stratiform and convective precipitation events. After applying the proposed correction, a comparison of the radar precipitation estimates with raingauges indicates a general reduction in both the root mean squared error and the fractional error variance, indicating the efficiency and robustness of the procedure.
Moreover, the technique presented is not computationally expensive, so it seems well suited to implementation in an operational environment.
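The two verification statistics used for the radar-raingauge comparison can be sketched as follows; these are common textbook forms, and the paper's exact definitions may differ slightly.

```python
import math

# Hedged sketch of the verification statistics named above: root mean
# squared error and a fractional error variance (error variance normalised
# by the gauge variance). Inputs are paired hourly accumulations.

def rmse(estimates, gauges):
    """Root mean squared error of radar estimates against gauge values."""
    n = len(gauges)
    return math.sqrt(sum((e - g) ** 2 for e, g in zip(estimates, gauges)) / n)

def fractional_error_variance(estimates, gauges):
    """Variance of the (radar - gauge) errors divided by the gauge variance."""
    n = len(gauges)
    errors = [e - g for e, g in zip(estimates, gauges)]
    mean_err = sum(errors) / n
    mean_g = sum(gauges) / n
    var_err = sum((x - mean_err) ** 2 for x in errors) / n
    var_g = sum((g - mean_g) ** 2 for g in gauges) / n
    return var_err / var_g
```

A drop in both statistics after the correction, as reported above, indicates that the procedure reduces both the typical error magnitude and its spread relative to the natural variability of the rainfall.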
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given their potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. In contrast, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the activity of IC and CG flashes is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase slightly more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of normalized (with respect to thunderstorm total duration and maximum value of the variables considered) thunderstorm parameters.
Among other findings, the study indicates that the normalized duration of the three stages of thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
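The normalisation described above rescales each thunderstorm parameter series to relative time (fraction of total duration) and relative magnitude (fraction of the maximum value), so that storms of different lengths and intensities can be compared. A minimal sketch, with illustrative names, assuming each series starts at storm onset and ends at dissipation:

```python
# Hedged sketch of the life-cycle normalisation: rescale a parameter series
# (e.g. flash rate sampled at observation times) to the [0, 1] interval in
# both time and magnitude, as done for the composite life-cycle patterns.

def normalise_life_cycle(times, values):
    """Return (relative_time, relative_value) lists, both in [0, 1]."""
    t0, t1 = times[0], times[-1]
    vmax = max(values)
    rel_t = [(t - t0) / (t1 - t0) for t in times]
    rel_v = [v / vmax for v in values]
    return rel_t, rel_v
```

After this step, stage boundaries (development/maturity/dissipation) of different storms fall on a common axis, which is what allows the statement that the maturity stage occupies roughly 80% of the normalised duration.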
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is made by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hour (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure has the ability to integrate the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) to obtain a robust methodology for implementing it in an operational and dynamic way.
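The blending step described above can be sketched as a skill-weighted mix of the two sources, switching to the model alone beyond the radar-useful horizon. Function names, the form of the weights, and the hard t+3 h cutoff are illustrative assumptions, not the SMC implementation.

```python
# Hedged sketch of the blending logic: combine the radar advection forecast
# (ADV) with the phase-corrected model output (MCO) using dynamically
# varying, skill-based weights; beyond t+2 h use the model output only.

def blend_qpf(adv_field, mco_field, adv_skill, mco_skill, lead_time_h):
    """Return a single blended QPF field (flat list of grid values)
    for one lead time, given non-negative skill indexes for each source."""
    if lead_time_h >= 3:
        # Radar-based extrapolation has lost most of its skill by now.
        return list(mco_field)
    total = adv_skill + mco_skill
    w_adv = adv_skill / total if total > 0 else 0.5
    return [w_adv * a + (1.0 - w_adv) * m for a, m in zip(adv_field, mco_field)]
```

Because the weights are recomputed from recent verification at every cycle, a source that is currently performing well automatically dominates the blend, which is the dynamic behaviour the abstract refers to.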
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation in a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, which contributes actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability levels of the system as well as for identifying its vulnerabilities. This enriched FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported with classical FMECA approaches.
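At the core of any FMECA sits a criticality ranking of failure modes. A minimal sketch of the classical scheme follows, using the common Risk Priority Number (severity x occurrence x detectability); the enriched methodology in the text layers operational procedures, alarms and human factors on top of this, and the 1-10 scales here are illustrative assumptions.

```python
# Hedged sketch of the basic inductive FMECA criticality ranking that the
# enriched approach builds on. Scales and field names are illustrative.

from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always detected) .. 10 (undetectable)

    def rpn(self) -> int:
        """Risk Priority Number: higher means more critical."""
        return self.severity * self.occurrence * self.detectability

def rank_by_criticality(modes):
    """Order failure modes so the most critical come first."""
    return sorted(modes, key=lambda m: m.rpn(), reverse=True)
```

Linking each `FailureMode` record to the functional layer it belongs to, as the commercial tool mentioned above does, is what allows criticality to be traced from component-level failures up to system-level hazards.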
Abstract:
Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and differentiate them from counterfeits. Detecting counterfeits is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and explaining the criminal phenomenon. Profiling seizures using chemical and packaging data constitutes a strong way to detect organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule have been studied. The results of the packaging and chemical analyses were gathered in an organised database. Strong linkage was found between the seizures at the different production steps, indicating the presence of a main counterfeit network dominating the market. The interpretation of the links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective has the potential to be generalised to other types of products. It may be the only reliable approach to help understand the organised crime phenomenon behind counterfeiting and to enable efficient strategic and operational decision making in an attempt to dismantle counterfeit networks.
Abstract:
In this paper we highlight the importance of operational costs in explaining economic growth and analyze how the industrial structure affects the growth rate of the economy. If there is monopolistic competition only in an intermediate goods sector, then production growth coincides with consumption growth. Moreover, the pattern of growth depends on the particular form of the operational cost. If the monopolistically competitive sector is the final goods sector, then per capita production is constant but per capita effective consumption, or welfare, grows. Finally, we modify the industrial structure of the economy once more and exhibit an economy with two different growth speeds, one for production and another for effective consumption. Thus, both the operational cost and the particular structure of the sector that produces the final goods ultimately determine the pattern of growth.
Abstract:
The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches for operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated and requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and prior experience in risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage compared to banks that opt for less sophisticated approaches. - Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period between 2004 and 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely-held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship.
- The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies Value Previous research showed the importance of external regulation on banks' behavior. Some inefficient standards may accentuate risk-taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period between 2008 and 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.
Abstract:
The distribution of plants along environmental gradients is constrained by abiotic and biotic factors. Cumulative evidence attests to the impact of biotic factors on plant distributions, but only a few studies discuss the role of belowground communities. Soil fungi, in particular, are thought to play an important role in how plant species assemble locally into communities. We first review existing evidence, and then test the effect of the number of soil fungal operational taxonomic units (OTUs) on plant species distributions using a recently collected dataset of plant and metagenomic information on soil fungi in the Western Swiss Alps. Using species distribution models (SDMs), we investigated whether the distribution of individual plant species is correlated with the number of OTUs of two important soil fungal classes known to interact with plants: the Glomeromycetes, which are obligate symbionts of plants, and the Agaricomycetes, which may be facultative plant symbionts, pathogens, or wood decayers. We show that including the fungal richness information in the models of plant species distributions improves predictive accuracy. The number of fungal OTUs is especially correlated with the distribution of high-elevation plant species. We suggest that high-elevation soils show greater variation in fungal assemblages, which may in turn affect plant turnover among communities. We finally discuss how to move beyond correlative analyses, through the design of field experiments manipulating plant and fungal communities along environmental gradients.
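The claim that adding fungal richness improves predictive accuracy amounts to comparing the discrimination of two SDM variants on the same presence/absence data. A minimal sketch of that comparison using the rank-based area under the ROC curve follows; the data and prediction values are entirely illustrative, and the paper's evaluation protocol may differ.

```python
# Hedged sketch of the model-comparison logic: no model is fitted here; we
# only compare how well two sets of SDM predictions (climate-only vs.
# climate plus fungal OTU richness) separate presences from absences.

def auc(scores, labels):
    """Rank-based AUC: probability that a random presence outscores a
    random absence (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]                          # observed presence/absence
pred_climate_only = [0.7, 0.4, 0.5, 0.6, 0.3, 0.2]   # illustrative predictions
pred_with_otus = [0.8, 0.6, 0.7, 0.4, 0.3, 0.2]      # OTU richness added

improvement = auc(pred_with_otus, labels) - auc(pred_climate_only, labels)
```

A positive `improvement` on held-out sites is the kind of evidence the abstract summarises; a proper test would of course use cross-validation rather than a single split.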
Abstract:
The planning effort for ISP began in 2006 when the IDOC retained the Durrant/PBA team of architects and planners to review the Iowa correctional system. The team conducted two studies in the following two years, the first being the April 2007 Iowa Department of Corrections Systemic Master Plan. Both studies addressed myriad aspects of the correctional system including treatment and re-entry needs and programs, security and training, and staffing.