861 results for Field-based model
Abstract:
We propose a method for detecting and analyzing so-called replay attacks in intrusion detection systems, in which an intruder adds a small number of hostile actions to a recorded session of a legitimate user or process and replays this session back to the system. The proposed approach can be applied when an automata-based model is used to describe the behavior of active entities in a computer system.
Abstract:
Vendor-managed inventory (VMI) is a widely used collaborative inventory management policy in which the manufacturer manages the inventory of retailers and takes responsibility for decisions on the timing and extent of inventory replenishment. VMI partnerships help organisations to reduce demand variability, inventory holding and distribution costs. This study provides empirical evidence that significant economic benefits can be achieved with the use of a genetic algorithm (GA)-based decision support system (DSS) in a VMI supply chain. A two-stage serial supply chain in which retailers and their supplier operate VMI in an uncertain demand environment is studied. Performance was measured in terms of cost, profit, stockouts and service levels. The results generated by the GA-based model were compared to traditional alternatives. The study found that the GA-based approach outperformed traditional methods and that its use can be economically justified in small- and medium-sized enterprises (SMEs).
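A minimal sketch of the kind of GA search such a DSS might run is given below; the demand distribution, cost parameters, and genetic operators are illustrative assumptions, not the system described in the abstract.

```python
# Minimal GA sketch: search over per-retailer replenishment quantities that
# trade off holding and stockout costs under sampled demand. All numbers are
# placeholders, not the study's calibrated DSS.
import random

HOLDING_COST, STOCKOUT_COST, N_RETAILERS = 1.0, 5.0, 3

def sample_demand():
    return [random.gauss(100, 20) for _ in range(N_RETAILERS)]

def cost(replenishment, n_scenarios=200):
    total = 0.0
    for _ in range(n_scenarios):
        for q, d in zip(replenishment, sample_demand()):
            total += HOLDING_COST * max(q - d, 0) + STOCKOUT_COST * max(d - q, 0)
    return total / n_scenarios

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind):
    return [max(0.0, q + random.gauss(0, 5)) for q in ind]

population = [[random.uniform(50, 150) for _ in range(N_RETAILERS)] for _ in range(30)]
for generation in range(100):
    population.sort(key=cost)
    parents = population[:10]                     # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

print("best replenishment quantities:", [round(q, 1) for q in min(population, key=cost)])
```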
Abstract:
In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context-clustering-based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and 6 out of 13 sub-tasks in the analogical reasoning task.
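A minimal sketch of the gloss-encoding step is shown below: a 1-D CNN over the tokens of a WordNet gloss yields a sentence-level vector that can seed a sense embedding. The vocabulary size, dimensions, and pooling choice are illustrative assumptions, not the paper's exact architecture.

```python
# Sketch: encode a WordNet gloss with a 1-D CNN and max-pool over tokens to get
# a sentence-level vector used to initialise the corresponding sense embedding.
import torch
import torch.nn as nn

class GlossEncoder(nn.Module):
    def __init__(self, vocab_size=10000, dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.conv = nn.Conv1d(in_channels=dim, out_channels=dim, kernel_size=3, padding=1)

    def forward(self, token_ids):                 # token_ids: (batch, gloss_length)
        x = self.embed(token_ids)                  # (batch, length, dim)
        x = self.conv(x.transpose(1, 2))           # (batch, dim, length)
        return torch.relu(x).max(dim=2).values     # max-pool over tokens -> (batch, dim)

encoder = GlossEncoder()
gloss = torch.randint(0, 10000, (1, 12))           # one 12-token gloss (random ids for the demo)
sense_init = encoder(gloss)                         # vector used to seed the sense embedding
print(sense_init.shape)                             # torch.Size([1, 100])
```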
Abstract:
We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period that is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced ordinary-differential-equation-based model, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons of the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
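For reference, the standard dispersion-managed NLS model underlying this setting is written below; the notation is the conventional one and is not quoted from the paper.

```latex
% u(z,t): field envelope, d(z): periodically varying group-velocity dispersion.
\[
  i\,\frac{\partial u}{\partial z}
  + \frac{d(z)}{2}\,\frac{\partial^{2} u}{\partial t^{2}}
  + |u|^{2}\,u = 0,
  \qquad d(z + L) = d(z).
\]
```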
Abstract:
Our study aims to contribute to the examination of how sustainability is interpreted in purchasing. In an earlier phase of the Versenyképesség Kutatás (Competitiveness Research) programme we dealt with the appearance of sustainability in purchasing, its possible interpretations and content elements, their structuring, and the motivational factors behind such initiatives. Building on those results, we wish to continue the topic. We review the results published in the literature since our last analysis and examine in which directions research has proceeded in the meantime. Beyond this review, we highlight areas whose deeper analysis can be considered relevant and therefore forward-looking. The first is the interpretation of ethics in purchasing; here an important output is the interpretation and review of the relevant concepts based on the international literature. As another strand of the research, we also wish to examine how, starting from research results and practical problems, a model of practical relevance can be built with the help of mathematical tools. _________ This paper aims to provide an overview of the developments in the literature on sustainable purchasing. This serves as the basis for the elaboration of new research topics in the field. Based upon the literature, investigations into two topics are presented. First, the issue of purchasing ethics is investigated, with the aim of identifying the effects of developments in purchasing management on ethics in purchasing. Second, a new methodology to assess the effect of green criteria on the purchasing decision is presented.
Abstract:
Climate change strongly affects tree growth and also threatens the forests of karstic terrains. From the 1980s the frequency of decay events in Pinus nigra Arnold forests increased markedly in Hungary. To understand the vulnerability of Pinus nigra forests to climate change on shallow karstic soils under continental-sub-Mediterranean climatic conditions, we studied three sampled populations in the typical karstic landscape of Veszprém in North Transdanubia. We built our model on a non-invasive approach using the annual growth of individual trees. The MPI ECHAM5 climate model was used, with the Thornthwaite Agrometeorological Index as the aridity index. Our results indicate that soil thickness up to 11 cm has a major influence on mean growth intensity, whereas aridity determines the annual growth rate. Our model results showed that the increasing decay frequency of the last decades paralleled the decreasing growth rate of the pines. The climate model predicts a similarly increased decay frequency relative to the present. Our results may be valid for wider areas on the periphery of the Mediterranean climate zone, while the annual-growth-based model is a cost-effective and simple method to study the vitality of pine trees in a given area.
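A minimal sketch of the annual-growth idea is shown below: regress yearly growth on soil thickness and an aridity index. The data and coefficients are synthetic placeholders under assumed variable names, not the study's measurements.

```python
# Sketch: least-squares fit of annual ring growth against soil thickness and a
# Thornthwaite-type aridity index (synthetic example data).
import numpy as np

soil_cm = np.array([4.0, 6.0, 8.0, 10.0, 11.0, 12.0, 15.0])        # soil thickness (cm)
aridity = np.array([0.8, 0.9, 0.7, 1.1, 1.0, 1.2, 0.9])            # aridity index per tree/year
growth_mm = np.array([1.1, 1.5, 1.4, 2.2, 2.4, 2.5, 2.6])          # annual ring width (mm)

X = np.column_stack([np.ones_like(soil_cm), soil_cm, aridity])
coef, *_ = np.linalg.lstsq(X, growth_mm, rcond=None)
print("intercept, soil effect, aridity effect:", np.round(coef, 3))
```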
Abstract:
Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for either the stock price level or its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividends with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models for narrow traditional dividends, and a stochastic discount rate in place of the constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices. The bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate Vector Autoregression (TVAR) model and a single-equation model to run cointegration tests among these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in the S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The ‘old economy’ as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Material and Telecommunication Services sectors and the Real Estate industry group.
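For context, the standard present-value relation referenced here is written below; the notation is conventional and the specific form of the stochastic discount factor used in the dissertation is not quoted.

```latex
% With a constant discount rate r, the fundamental price is
\[
  P_t \;=\; \sum_{k=1}^{\infty} \frac{\mathbb{E}_t\!\left[D_{t+k}\right]}{(1+r)^{k}},
\]
% and with a stochastic discount factor m_{t+j} replacing 1/(1+r),
\[
  P_t \;=\; \mathbb{E}_t\!\left[\sum_{k=1}^{\infty}\Bigl(\prod_{j=1}^{k} m_{t+j}\Bigr) D_{t+k}\right],
\]
% so a "bubble" is the gap between the observed price and this fundamental value.
```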
Abstract:
Urban growth models have been used for decades to forecast urban development in metropolitan areas. Since the 1990s cellular automata, with simple computational rules and an explicitly spatial architecture, have been heavily utilized in this endeavor. One such cellular-automata-based model, SLEUTH, has been successfully applied around the world to better understand and forecast not only urban growth but also other forms of land-use and land-cover change; like other models, however, it must be fed important information about which particular lands in the modeled area are available for development. Some of these lands fall into categories intended to exclude urban growth that are difficult to quantify, since their function is dictated by policy. One such category includes voluntary differential assessment programs, whereby farmers agree not to develop their lands in exchange for significant tax breaks. Because they are voluntary, today’s excluded lands may become available for development at some point in the future. Mapping the shifting mosaic of parcels enrolled in such programs allows this information to be used in modeling and forecasting. In this study, we added information about California’s Williamson Act to SLEUTH’s excluded layer for Tulare County. Assumptions about the voluntary differential assessments were used to create a sophisticated excluded layer that was fed into SLEUTH’s urban growth forecasting routine. The results not only demonstrate a successful execution of this method but also yield high goodness-of-fit metrics for both the calibration of enrollment termination and the urban growth modeling itself.
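A minimal sketch of the excluded-layer idea is given below: hard exclusions (water, protected land) are combined with a probabilistic exclusion for parcels enrolled in a voluntary program. The grid, masks, and exclusion probability are illustrative assumptions, not the study's calibrated layer.

```python
# Sketch: build a weighted exclusion raster where enrolled parcels get a
# partial exclusion probability rather than a hard exclusion.
import numpy as np

rows, cols = 100, 100
hard_excluded = np.zeros((rows, cols), dtype=bool)
hard_excluded[:10, :] = True                  # e.g. a protected strip

enrolled = np.zeros((rows, cols), dtype=bool)
enrolled[40:60, 40:60] = True                 # parcels currently under contract

ENROLLED_EXCLUSION = 0.8   # assumed chance an enrolled parcel stays undeveloped

excluded_layer = np.zeros((rows, cols))       # 0 = fully available, 1 = fully excluded
excluded_layer[enrolled] = ENROLLED_EXCLUSION
excluded_layer[hard_excluded] = 1.0           # hard exclusions override

# Exclusion layers of this kind are typically written out as 0-100 integer rasters.
print((excluded_layer * 100).astype(np.uint8).max())
```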
Abstract:
With the continued and unprecedented decline of coral reefs worldwide, evaluating the factors that contribute to coral demise is of critical importance. As coral cover declines, macroalgae are becoming more common on tropical reefs. Interactions between these macroalgae and corals may alter the coral microbiome, which is thought to play an important role in colony health and survival. Together, such changes in benthic macroalgae and in the coral microbiome may result in a feedback mechanism that contributes to additional coral cover loss. To determine if macroalgae alter the coral microbiome, we conducted a field-based experiment in which the coral Porites astreoides was placed in competition with five species of macroalgae. Macroalgal contact increased variance in the coral-associated microbial community, and two algal species significantly altered microbial community composition. All macroalgae caused the disappearance of a γ-proteobacterium previously hypothesized to be an important mutualist of P. astreoides. Macroalgal contact also triggered: 1) increases or 2) decreases in microbial taxa already present in corals, 3) establishment of new taxa in the coral microbiome, and 4) vectoring and growth of microbial taxa from the macroalgae to the coral. Furthermore, macroalgal competition decreased coral growth rates by an average of 36.8%. Overall, this study found that competition between corals and certain species of macroalgae leads to an altered coral microbiome, providing a potential mechanism by which macroalgae-coral interactions reduce coral health and lead to coral loss on impacted reefs.
Abstract:
Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially the costs associated with the skyrocketing power consumption of large data centers. In the meantime, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve the profits of service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiency with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements. By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
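A minimal sketch of the dispatching idea is shown below: each request is sent to the data center with the highest expected profit, where profit is revenue minus electricity cost and revenue is earned only if an M/M/1 estimate of the response time meets the deadline. The prices, rates, and revenue values are illustrative assumptions, not the dissertation's queue-based model.

```python
# Sketch: greedy profit-aware dispatching across data centres in different
# electricity markets, using an M/M/1 mean-response-time estimate.
REVENUE_PER_REQUEST = 0.05      # $ earned per request served within its deadline
ENERGY_PER_REQUEST = 0.002      # kWh consumed per request

data_centres = [
    {"name": "dc-east", "price_per_kwh": 0.12, "service_rate": 100.0, "arrival_rate": 70.0},
    {"name": "dc-west", "price_per_kwh": 0.08, "service_rate": 80.0,  "arrival_rate": 80.0},
]

def expected_profit(dc, deadline):
    load_after = dc["arrival_rate"] + 1.0
    if load_after >= dc["service_rate"]:
        return float("-inf")                                    # queue would be unstable
    response_time = 1.0 / (dc["service_rate"] - load_after)     # M/M/1 mean response time
    revenue = REVENUE_PER_REQUEST if response_time <= deadline else 0.0
    return revenue - ENERGY_PER_REQUEST * dc["price_per_kwh"]

def dispatch(deadline=0.05):
    best = max(data_centres, key=lambda dc: expected_profit(dc, deadline))
    best["arrival_rate"] += 1.0                                 # account for the dispatched request
    return best["name"]

print(dispatch())
```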
Abstract:
Determining the past record of temperature and salinity of ocean surface waters is essential for understanding past changes in climate, such as those which occur across glacial-interglacial transitions. As a useful proxy, the oxygen isotope composition (δ18O) of calcite from planktonic foraminifera has been shown to reflect both surface temperature and seawater δ18O, itself an indicator of global ice volume and salinity (Shackleton, 1974; Rostek et al., 1993, doi:10.1038/364319a0). In addition, magnesium/calcium (Mg/Ca) ratios in foraminiferal calcite show a temperature dependence (Nürnberg, 1995, doi:10.2113/gsjfr.25.4.350; Nürnberg et al., 1996, doi:10.1016/0016-7037(95)00446-7; Lea et al., 1999, doi:10.1016/S0016-7037(99)00197-0) due to the partitioning of Mg during calcification. Here we demonstrate, in a field-based calibration experiment, that the variation of Mg/Ca ratios with temperature is similar for eight species of planktonic foraminifera (when accounting for Mg dissolution effects). Using a multi-species record from the Last Glacial Maximum in the North Atlantic Ocean, we found that past temperatures reconstructed from Mg/Ca ratios followed the two other palaeotemperature proxies: faunal abundance (CLIMAP, 1981; Mix et al., 1999, doi:10.1029/1999PA900012) and alkenone saturation (Müller et al., 1998, doi:10.1016/S0016-7037(98)00097-0). Moreover, combining Mg/Ca and δ18O data from the same faunal assemblage, we show that the surface water δ18O reconstructed from all foraminiferal species records the same glacial-interglacial change, representing changing hydrography and global ice volume. This reinforces the potential of this combined technique for probing past ocean-climate interactions.
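For reference, the Mg/Ca palaeotemperature calibration conventionally takes the exponential form written below; the generic form is well established, but the coefficients A and B are species-dependent and are not taken from this paper.

```latex
% T: calcification temperature; Mg/Ca in mmol/mol; A, B: species-specific constants.
\[
  \mathrm{Mg/Ca} \;=\; B\,\exp(A\,T)
  \quad\Longleftrightarrow\quad
  T \;=\; \frac{1}{A}\,\ln\!\left(\frac{\mathrm{Mg/Ca}}{B}\right).
\]
```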
Abstract:
This paper considers how far Anglo-Saxon conceptions have influenced European Union vocational education and training (VET) policy, especially given the disparate approaches to VET across Europe. Two dominant approaches can be identified: the dual system (exemplified by Germany), and output-based models (exemplified by the NVQ ‘English style’). Within the EU itself, the design philosophy of the English output-based model proved in the first instance influential in attempts to develop tools to establish equivalence between vocational qualifications across Europe, resulting in the learning-outcomes approach of the European Qualifications Framework, the credit-based model of the European VET Credit System and the task-based construction of occupation profiles exemplified by European Skills, Competences and Occupations. The governance model for the English system is, however, predicated on employer demand for ‘skills’, and this does not fit well with the social partnership model encompassing knowledge, skills and competences that is dominant in northern Europe. These contrasting approaches have led to continual modifications of the tools as they sought to harmonise and reconcile national VET requirements with the original design. A tension is evident in particular between national and regional approaches to vocational education and training, on the one hand, and the policy tools adopted to align European vocational education and training better with the demands of the labour market, including at sectoral level, on the other. This paper explores these tensions and considers the prospects for the successful operation of these tools, paying particular attention to the European Qualifications Framework, the European VET Credit System and the European Skills, Competences and Occupations tool and the relationships between them, drawing on studies of the construction and furniture industries.
Abstract:
In this paper we present a convolutional neural network (CNN)-based model for human head pose estimation in low-resolution multi-modal RGB-D data. We pose the problem as one of classification of human gazing direction. We further fine-tune a regressor based on the learned deep classifier. Next we combine the two models (classification and regression) to estimate approximate regression confidence. We present state-of-the-art results on datasets that span the range from high-resolution human-robot interaction data (close-up faces plus depth information) to challenging low-resolution outdoor surveillance data. We build upon our robust head-pose estimation and further introduce a new visual attention model to recover interaction with the environment. Using this probabilistic model, we show that many higher-level scene understanding tasks, such as human-human/scene interaction detection, can be achieved. Our solution runs in real time on commercial hardware.
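A minimal sketch of the two-stage idea is given below: a small CNN classifies coarse gazing directions, a regression head fine-tuned on the same features predicts continuous angles, and the classifier's softmax peak serves as a rough confidence estimate. The architecture and sizes are illustrative assumptions, not the paper's network.

```python
# Sketch: shared CNN features over an RGB-D patch feed both a coarse
# gazing-direction classifier and a fine-tuned angle regressor; the softmax
# peak of the classifier is used as an approximate confidence.
import torch
import torch.nn as nn

N_DIRECTIONS = 8   # coarse gazing-direction bins

class HeadPoseNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(                    # shared RGB-D feature extractor
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.classifier = nn.Linear(32, N_DIRECTIONS)     # coarse direction bins
        self.regressor = nn.Linear(32, 3)                  # fine-tuned yaw/pitch/roll head

    def forward(self, x):                                  # x: (batch, 4, H, W) RGB-D patch
        f = self.features(x)
        logits = self.classifier(f)
        angles = self.regressor(f)
        confidence = logits.softmax(dim=1).max(dim=1).values
        return angles, confidence

net = HeadPoseNet()
patch = torch.randn(1, 4, 32, 32)                          # low-resolution RGB-D head crop
angles, confidence = net(patch)
print(angles.shape, float(confidence))
```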