846 results for Tessellation-based model


Relevance: 80.00%

Abstract:

The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with "negative absorption" in interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, the random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances of up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features.
The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to, and even exceed, those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated in both single-mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and prospects. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for the discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength and narrow-band generation, and random fibre lasers operating in different spectral bands in the 1-1.6 μm range. We then discuss existing and future applications of random fibre lasers, including telecommunication and distributed long-reach sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory. We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science.
Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. We hope that this review will provide background information for research in various fields and will stimulate cross-disciplinary collaborations on random fibre lasers. © 2014 Elsevier B.V.
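The threshold argument above lends itself to a back-of-the-envelope calculation. The sketch below is a simplified undepleted-pump estimate with illustrative silica-fibre numbers (the coefficients are assumptions, not values from the review): it finds the pump power at which the single-pass Raman gain times the weak Rayleigh reflection reaches unity.

```python
import math

def raman_gain_exponent(pump_w, g_r, alpha_p, alpha_s, length_m):
    """Net power-gain exponent for the Stokes wave over the fibre,
    assuming an undepleted pump decaying as P0 * exp(-alpha_p * z)."""
    integrated_pump = pump_w * (1.0 - math.exp(-alpha_p * length_m)) / alpha_p
    return g_r * integrated_pump - alpha_s * length_m

def threshold_pump(g_r, alpha_p, alpha_s, length_m, rayleigh_refl=1e-3):
    """Pump power at which gain times the weak effective Rayleigh
    reflection (~0.1%) reaches unity: G * R = 1 (bisection)."""
    lo, hi = 0.0, 1e4
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        g = raman_gain_exponent(mid, g_r, alpha_p, alpha_s, length_m)
        if g + math.log(rayleigh_refl) < 0.0:   # still below threshold
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative numbers (assumed): g_R ~ 4e-4 (W m)^-1, loss ~0.2 dB/km
# ~ 4.6e-5 m^-1 at both pump and Stokes wavelengths, L = 100 km.
p_th = threshold_pump(4e-4, 4.6e-5, 4.6e-5, 1e5)
```

With these assumed coefficients the estimate lands near a watt, consistent with the qualitative statement that a sufficiently large distributed Raman gain can overcome the tiny Rayleigh feedback.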

Relevance: 80.00%

Abstract:

This paper presents a novel intonation modelling approach and demonstrates its applicability using the Standard Yorùbá language. Our approach is motivated by the theory that abstract and realised forms of intonation and other dimensions of prosody should be modelled within a modular and unified framework. In our model, this framework is implemented using the Relational Tree (R-Tree) technique. The R-Tree is a sophisticated data structure for representing a multi-dimensional waveform in the form of a tree. Our R-Tree for an utterance is generated in two steps. First, the abstract structure of the waveform, called the Skeletal Tree (S-Tree), is generated using tone phonological rules for the target language. Second, the numerical values of the perceptually significant peaks and valleys on the S-Tree are computed using a fuzzy-logic-based model. The resulting points are then joined by applying interpolation techniques. The actual intonation contour is synthesised by the Pitch Synchronous Overlap and Add (PSOLA) technique using the Praat software. We performed both quantitative and qualitative evaluations of our model. The preliminary results suggest that, although the model does not predict the numerical speech data as accurately as contemporary data-driven approaches, it produces synthetic speech with comparable intelligibility and naturalness. Furthermore, our model is easy to implement, interpret and adapt to other tone languages.
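The second step, joining the computed peaks and valleys into a contour, can be illustrated with a minimal sketch. The anchor values below are invented for illustration; in the model they come from the fuzzy-logic component, and other interpolation techniques could replace the linear one used here.

```python
import numpy as np

# Perceptually significant turning points on the S-Tree: (time_s, f0_hz).
# These numbers are hypothetical, not taken from the paper.
anchors = [(0.00, 120.0), (0.15, 180.0), (0.40, 140.0), (0.70, 95.0)]

def contour(anchors, step=0.01):
    """Join the peaks/valleys by linear interpolation into an F0 contour
    sampled every `step` seconds."""
    t, f0 = zip(*anchors)
    grid = np.arange(t[0], t[-1] + step / 2, step)
    return grid, np.interp(grid, t, f0)

grid, f0 = contour(anchors)
```

The resulting contour passes through every anchor; a monotone cubic interpolant would give a smoother curve at the cost of a little more code.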

Relevance: 80.00%

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology, and logical insurance plans. The risk-based model uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and to analyze their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
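The AHP step, deriving factor weights from pairwise comparisons via the principal eigenvector, can be sketched as follows. The factors and judgement values are hypothetical, not taken from the paper.

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Principal-eigenvector weights of an AHP pairwise-comparison
    matrix (power iteration), plus the consistency index CI."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    w = np.ones(n) / n
    for _ in range(iters):
        w = a @ w
        w /= w.sum()
    lam = (a @ w / w).mean()      # estimate of the principal eigenvalue
    ci = (lam - n) / (n - 1)      # CI = 0 for a perfectly consistent matrix
    return w, ci

# Hypothetical 3-factor comparison (say, corrosion vs third-party damage
# vs design defects); the judgements are illustrative only.
m = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, ci = ahp_weights(m)
```

In practice the CI is divided by a random index to give the consistency ratio, which is checked against the usual 0.1 acceptance threshold.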

Relevance: 80.00%

Abstract:

The profusion of performance measurement models suggested in the Management Accounting literature in the 1990s is one illustration of the substantial changes in Management Accounting teaching materials since the publication of "Relevance Lost" in 1987. At the same time, in the general context of increasing competition and globalisation, it is widely thought that national cultural differences are tending to disappear, meaning that the management techniques used in large companies, including performance measurement and management instruments (PMS), tend to be the same, irrespective of the company's nationality or location. North American management practice is traditionally described as a contractually based model, mainly focused on financial performance information and measures (FPMs) and more shareholder-focused than that of French companies. Within France, the literature historically defined performance as broadly multidimensional, driven by the idea that there are no universal rules of management and that efficient management takes local culture and traditions into account. As opposed to their North American counterparts, French companies are pressured more by the financial institutions that fund them than by capital markets. They can therefore pay greater attention to the long term because they are not subject to quarterly capital market objectives. Hence, management in France should rely on longer-term, more qualitative, less financial and more multidimensional data to assess performance than its North American counterpart. The objective of this research is to investigate whether large French and US companies' practices have changed in the way the textbooks have changed with regard to performance measurement and management, or whether cultural differences are still driving differences in performance measurement and management between them.
The research findings support the idea that large US and French companies share the same PMS features, influenced by ‘universal’ PM models.

Relevance: 80.00%

Abstract:

Epilepsy is one of the most common neurological disorders, a large fraction of which is resistant to pharmacotherapy. In this light, understanding the mechanisms of epilepsy, and of its intractable forms in particular, could create new targets for pharmacotherapeutic intervention. The current project explores the dynamic changes in neuronal network function in chronic temporal lobe epilepsy (TLE) in rat and human brain in vitro. I focused on the process of establishment of epilepsy (epileptogenesis) in the temporal lobe. Rhythmic behaviour of the hippocampal neuronal networks in healthy animals was explored using spontaneous oscillations in the gamma frequency band (SγO). The use of an improved brain slice preparation technique resulted in the natural occurrence (in the absence of pharmacological stimulation) of rhythmic activity, which was then pharmacologically characterised and compared to other models of gamma oscillations (KA- and CCh-induced oscillations) using the local field potential recording technique. The results showed that SγO differed from pharmacologically driven models, suggesting higher physiological relevance of SγO. Network activity was also explored in the medial entorhinal cortex (mEC), where spontaneous slow wave oscillations (SWO) were detected. To investigate the course of chronic TLE establishment, a refined Li-pilocarpine-based model of epilepsy (RISE) was developed. The model significantly reduced animal mortality and demonstrated reduced intensity yet high morbidity, with an almost 70% mean success rate of developing spontaneous recurrent seizures. We used SγO to characterize changes in the hippocampal neuronal networks throughout the epileptogenesis. The results showed that the network remained largely intact, demonstrating the subtle nature of the RISE model.
Despite this, a reduction in network activity was detected during the so-called latent (no seizure) period, which was hypothesized to occur due to network fragmentation and an abnormal function of kainate receptors (KAr). We therefore explored the function of KAr by challenging SγO with kainic acid (KA). The results demonstrated a remarkable decrease in KAr response during the latent period, suggesting KAr dysfunction or altered expression, which will be further investigated using a variety of electrophysiological and immunocytochemical methods. The entorhinal cortex, together with the hippocampus, is known to play an important role in TLE. Considering this, we investigated neuronal network function of the mEC during epileptogenesis using SWO. The results demonstrated a striking difference in AMPAr function, with possible receptor upregulation or abnormal composition in the early development of epilepsy. Alterations in receptor function inevitably lead to changes in network function, which may play an important role in the development of epilepsy. Preliminary investigations were made using slices of human brain tissue taken following surgery for intractable epilepsy. Initial results showed that oscillogenesis could be induced in human brain slices and that such network activity was pharmacologically similar to that observed in rodent brain. Overall, our findings suggest that excitatory glutamatergic transmission is heavily involved in the process of epileptogenesis. Together with other types of receptors, KAr and AMPAr contribute to epilepsy establishment and may be the key to uncovering its mechanism.

Relevance: 80.00%

Abstract:

We propose a method for detecting and analyzing so-called replay attacks in intrusion detection systems, in which an intruder contributes a small number of hostile actions to a recorded session of a legitimate user or process and replays this session back to the system. The proposed approach can be applied if an automata-based model is used to describe the behavior of active entities in a computer system.
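A minimal sketch of the idea, assuming the simplest possible automata-based model: states are the events themselves, and the legitimate transition relation is learned from recorded sessions. A replayed session with injected hostile actions then leaves the automaton at the injection points. The event names are invented for illustration.

```python
def learn_automaton(sessions):
    """Build the transition relation of a simple automaton from
    recorded legitimate sessions (states = events, for illustration)."""
    trans = set()
    for s in sessions:
        for a, b in zip(s, s[1:]):
            trans.add((a, b))
    return trans

def replay_alerts(trans, session):
    """Flag positions where the replayed session leaves the automaton,
    i.e. uses a transition never seen in legitimate behaviour."""
    return [i + 1 for i, (a, b) in enumerate(zip(session, session[1:]))
            if (a, b) not in trans]

legit = [["login", "read", "write", "logout"],
         ["login", "read", "logout"]]
trans = learn_automaton(legit)

# A recorded session replayed with one injected hostile action.
replayed = ["login", "read", "chmod", "write", "logout"]
```

A real deployment would presumably keep per-entity automata with richer state; this only shows why deviations from learned transitions localize the injected actions.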

Relevance: 80.00%

Abstract:

Vendor-managed inventory (VMI) is a widely used collaborative inventory management policy in which the manufacturer manages the inventory of retailers and takes responsibility for decisions related to the timing and extent of inventory replenishment. VMI partnerships help organisations to reduce demand variability, inventory holding and distribution costs. This study provides empirical evidence that significant economic benefits can be achieved with the use of a genetic algorithm (GA)-based decision support system (DSS) in a VMI supply chain. A two-stage serial supply chain in which retailers and their supplier operate VMI in an uncertain demand environment is studied. Performance was measured in terms of cost, profit, stockouts and service levels. The results generated from the GA-based model were compared to traditional alternatives. The study found that the GA-based approach outperformed traditional methods and that its use can be economically justified in small- and medium-sized enterprises (SMEs).
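A toy version of such a GA-based DSS can be sketched as below: a GA searches for order-up-to levels for the retailers that minimize expected holding and stockout cost over sampled demand scenarios. The cost structure, demand distribution and GA settings are illustrative assumptions, not the study's actual configuration.

```python
import random

random.seed(0)

# Seeded demand scenarios for 3 retailers (hypothetical numbers).
SCENARIOS = [[random.gauss(20, 5) for _ in range(3)] for _ in range(100)]
HOLD, SHORT = 1.0, 9.0          # per-unit holding and stockout costs

def cost(levels):
    """Expected one-period cost of the order-up-to levels under VMI."""
    c = 0.0
    for demand in SCENARIOS:
        for lvl, d in zip(levels, demand):
            c += HOLD * max(lvl - d, 0) + SHORT * max(d - lvl, 0)
    return c / len(SCENARIOS)

def ga(generations=40, pop_size=24):
    baseline = [20, 20, 20]                      # order-up-to = mean demand
    pop = [baseline] + [[random.randint(0, 60) for _ in range(3)]
                        for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:8]                        # truncation selection
        children = [pop[0]]                      # elitism keeps the best
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randint(1, 2)           # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:            # point mutation
                i = random.randrange(3)
                child = child[:i] + [random.randint(0, 60)] + child[i+1:]
            children.append(child)
        pop = children
    return min(pop, key=cost)

best = ga()
```

Because stockouts are penalized nine times as heavily as holding here, the GA should push the levels above mean demand, beating the naive "stock the mean" baseline.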

Relevance: 80.00%

Abstract:

In recent years, there has been an increasing interest in learning a distributed representation of word sense. Traditional context-clustering-based models usually require careful tuning of model parameters, and typically perform worse on infrequent word senses. This paper presents a novel approach which addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on 2 out of 4 metrics in the word similarity task, and on 6 out of 13 subtasks in the analogical reasoning task.
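The two-stage idea, gloss-based initialization followed by context clustering, can be illustrated with a toy sketch. Here 2-d vectors and a mean-of-cluster update stand in for the CNN gloss encoder and the paper's actual clustering model; the sense labels and data are invented.

```python
import numpy as np

def init_sense_embeddings(gloss_vecs):
    """Initialize one embedding per sense from its gloss vector
    (standing in for the CNN-encoded WordNet gloss)."""
    return {s: v.copy() for s, v in gloss_vecs.items()}

def cluster_contexts(contexts, senses, iters=5):
    """Context clustering: assign each context vector to the nearest
    sense embedding (dot-product similarity), then move each
    embedding to the mean of its assigned contexts."""
    for _ in range(iters):
        buckets = {s: [] for s in senses}
        for c in contexts:
            s = max(senses, key=lambda k: c @ senses[k])
            buckets[s].append(c)
        for s, vs in buckets.items():
            if vs:
                senses[s] = np.mean(vs, axis=0)
    return senses

# Toy 2-d gloss vectors for two senses of "bank" (illustrative only).
gloss = {"bank.finance": np.array([1.0, 0.0]),
         "bank.river":   np.array([0.0, 1.0])}
rng = np.random.default_rng(0)
ctx = np.vstack([rng.normal([0.9, 0.1], 0.05, (20, 2)),
                 rng.normal([0.1, 0.9], 0.05, (20, 2))])
senses = cluster_contexts(list(ctx), init_sense_embeddings(gloss))
```

The gloss initialization is what anchors each cluster to a dictionary sense, which is the paper's remedy for the poor behaviour of pure context clustering on infrequent senses.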

Relevance: 80.00%

Abstract:

We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period that is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced model based on ordinary differential equations, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons with the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
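Propagation in the NLSE with a periodic dispersion map is commonly simulated with the split-step Fourier method; the sketch below uses an illustrative two-section map and launch pulse (all parameter values are assumptions, not taken from the review).

```python
import numpy as np

def ssfm_dm(u, dz, steps, beta2_map, gamma, t_window):
    """Split-step Fourier propagation of the NLSE
        i u_z = (beta2(z)/2) u_tt - gamma |u|^2 u
    with a z-periodic dispersion map beta2_map(z)."""
    n = u.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=t_window / n)
    z = 0.0
    for _ in range(steps):
        b2 = beta2_map(z)
        # Linear (dispersive) half of the step, done in Fourier space.
        u = np.fft.ifft(np.fft.fft(u) * np.exp(0.5j * b2 * w**2 * dz))
        # Nonlinear phase rotation in the time domain.
        u *= np.exp(1j * gamma * np.abs(u)**2 * dz)
        z += dz
    return u

# Illustrative two-section map: anomalous then normal dispersion.
def beta2_map(z, l_map=1.0):
    return -5.0 if (z % l_map) < 0.5 else +4.8

t = np.linspace(-20, 20, 1024, endpoint=False)
u0 = 1.2 / np.cosh(t)                            # sech launch pulse
u1 = ssfm_dm(u0.astype(complex), dz=0.01, steps=1000,
             beta2_map=beta2_map, gamma=1.0, t_window=40.0)
```

Since both sub-steps are pure phase multiplications, the scheme conserves pulse energy to machine precision, which is a convenient sanity check on any implementation.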

Relevance: 80.00%

Abstract:

Climate change strongly affects tree growth and also threatens the forests of karstic terrains. From the 1980s, the frequency of decay events in the Pinus nigra Arnold forests of Hungary increased markedly. To understand the vulnerability of Pinus nigra forests to climate change on shallow karstic soils under continental-sub-Mediterranean climatic conditions, we studied three sampled populations in the typical karstic landscape of Veszprém in North Transdanubia. We built our model on a non-invasive approach using the annual growth of individuals. The MPI ECHAM5 climate model was used, with the Thornthwaite Agrometeorological Index as the aridity index. Our results indicate that soil thickness up to 11 cm has a major influence on the mean growth intensity, whereas aridity determines the annual growth rate. Our model results showed that the increasing decay frequency of recent decades paralleled the decreasing growth rate of the pines. The climate model predicts a similarly increased decay frequency in the future. Our results may be valid for wider areas on the periphery of the Mediterranean climate zone, while the annual-growth-based model is a cost-effective and simple method to study the vitality of pine trees in a given area.

Relevance: 80.00%

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for both the stock price level and its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividends that are more commonly used with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models for narrow traditional dividends, and a stochastic discount rate in place of the constant discount rate. Empirical results show that the model described above predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices are still largely detached from fundamental prices. The bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate Vector Autoregression (TVAR) model and a single equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in the S&P data during the late 1990s.
Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
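The sup augmented Dickey-Fuller idea can be sketched in a simplified form: compute the ADF t-statistic over forward-expanding windows and take the supremum, which becomes large and positive when the series has an explosive episode. This sketch omits lag augmentation and the proper critical values that the full test requires.

```python
import numpy as np

def adf_t(y):
    """t-statistic of rho in the regression dy_t = a + rho*y_{t-1} + e."""
    dy, ylag = np.diff(y), y[:-1]
    x = np.column_stack([np.ones_like(ylag), ylag])
    beta, *_ = np.linalg.lstsq(x, dy, rcond=None)
    resid = dy - x @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(x.T @ x)
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, min_window=30):
    """Sup-ADF statistic over forward-expanding windows."""
    return max(adf_t(y[:n]) for n in range(min_window, len(y) + 1))

rng = np.random.default_rng(1)
random_walk = np.cumsum(rng.normal(size=200))
explosive = np.empty(200)
explosive[0] = 1.0
for t in range(1, 200):                 # mildly explosive AR(1)
    explosive[t] = 1.05 * explosive[t - 1] + 0.01 * rng.normal()
```

On the simulated data the explosive series yields a far larger statistic than the random walk; in the thesis the decision is of course made against simulated critical values rather than an ad hoc cutoff.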

Relevance: 80.00%

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed and there is a discussion of implementation issues. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains. The LBM solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (Darcy's law based) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D). However, the new LBM-based model retains the ability to solve inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches.
The analogy between Fick’s second law (diffusion equation) and the transient ground water flow equation is used to solve the transient head distribution. An altered-velocity flow solver with source/sink term is applied to simulate a drawdown curve. Hydraulic parameters like transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM’s effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
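The diffusion analogy for transient ground water flow can be illustrated with a minimal one-dimensional LBM sketch: a D1Q3 BGK scheme whose macroscopic variable plays the role of hydraulic head. This is an illustration of the analogy in lattice units, not one of the dissertation's actual solvers.

```python
import numpy as np

def lbm_diffusion(rho0, tau, steps):
    """D1Q3 lattice Boltzmann solver for the diffusion equation
    (the transient ground water flow analogy), with diffusion
    coefficient D = c_s^2 * (tau - 1/2), c_s^2 = 1/3, in lattice units."""
    w = np.array([2/3, 1/6, 1/6])                # rest, +1, -1 weights
    f = w[:, None] * rho0[None, :]               # start at equilibrium
    for _ in range(steps):
        rho = f.sum(axis=0)
        feq = w[:, None] * rho[None, :]
        f += (feq - f) / tau                     # BGK collision
        f[1] = np.roll(f[1], 1)                  # stream right
        f[2] = np.roll(f[2], -1)                 # stream left
    return f.sum(axis=0)

n, tau, steps = 64, 1.0, 500
x = np.arange(n)
head0 = 1.0 + 0.1 * np.sin(2 * np.pi * x / n)    # sinusoidal head mode
head = lbm_diffusion(head0, tau, steps)
```

A sinusoidal mode of wavenumber k should decay as exp(-D k^2 t), and total "mass" (here, summed head) is conserved by construction, which is how such a solver is verified against the analytical solution.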

Relevance: 80.00%

Abstract:

Urban growth models have been used for decades to forecast urban development in metropolitan areas. Since the 1990s cellular automata, with simple computational rules and an explicitly spatial architecture, have been heavily utilized in this endeavor. One such cellular-automata-based model, SLEUTH, has been successfully applied around the world to better understand and forecast not only urban growth but also other forms of land-use and land-cover change, but like other models must be fed important information about which particular lands in the modeled area are available for development. Some of these lands are in categories for the purpose of excluding urban growth that are difficult to quantify since their function is dictated by policy. One such category includes voluntary differential assessment programs, whereby farmers agree not to develop their lands in exchange for significant tax breaks. Since they are voluntary, today’s excluded lands may be available for development at some point in the future. Mapping the shifting mosaic of parcels that are enrolled in such programs allows this information to be used in modeling and forecasting. In this study, we added information about California’s Williamson Act into SLEUTH’s excluded layer for Tulare County. Assumptions about the voluntary differential assessments were used to create a sophisticated excluded layer that was fed into SLEUTH’s urban growth forecasting routine. The results demonstrate not only a successful execution of this method but also yielded high goodness-of-fit metrics for both the calibration of enrollment termination as well as the urban growth modeling itself.
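The basic mechanics, growth seeded from existing urban cells and blocked by an excluded layer, can be sketched with a toy cellular automaton. This is an illustration of the idea only, not SLEUTH's actual rule set, coefficients or calibration; the grid, probabilities and excluded parcels are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 50
urban = np.zeros((N, N), bool)
urban[25, 25] = True                       # seed settlement
excluded = np.zeros((N, N), bool)
excluded[:, 40:] = True                    # e.g. enrolled farmland parcels

def step(urban, excluded, p_spread=0.2, p_spontaneous=0.0005):
    """One growth step: edge growth around existing urban cells plus
    rare spontaneous urbanization, both blocked by the excluded layer."""
    pad = np.pad(urban, 1)
    neighbours = sum(pad[1 + dy:N + 1 + dy, 1 + dx:N + 1 + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    grow = (neighbours > 0) & (rng.random((N, N)) < p_spread)
    spont = rng.random((N, N)) < p_spontaneous
    return (urban | grow | spont) & ~excluded

for _ in range(40):
    urban = step(urban, excluded)
```

Making the excluded mask time-dependent, so that parcels leave the mask when their enrollment terminates, is precisely the refinement the study feeds into SLEUTH's excluded layer.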


Relevance: 80.00%

Abstract:

Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long been suffering from high operational costs, especially those associated with the skyrocketing power consumption of large data centers. In the meantime, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve the profits for service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiencies with statistically guaranteed QoS. Finally, we study the power-conscious resource provision problem for service requests with different QoS requirements.
By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
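As a toy illustration of trading powered-on servers against a delay QoS target, consider sizing capacity with the M/M/1 sojourn-time bound. This is a deliberately simplified stand-in for the dissertation's queue-based models: all numbers are assumptions, and the exponential sojourn-time result holds only for the M/M/1 queue with even Poisson splitting.

```python
import math

def min_service_rate(arrival_rate, deadline, qos):
    """Smallest M/M/1 service rate mu guaranteeing
    P(sojourn time <= deadline) >= qos; for M/M/1 the sojourn
    time is Exp(mu - lambda), so solve 1 - exp(-(mu-l)*d) = qos."""
    return arrival_rate - math.log(1.0 - qos) / deadline

def powered_on_servers(arrival_rate, per_server_rate, deadline, qos):
    """Fewest identical servers (load split evenly) whose per-server
    rate meets the statistical deadline guarantee."""
    k = 1
    while min_service_rate(arrival_rate / k, deadline, qos) > per_server_rate:
        k += 1
    return k
```

The monotonic trade-off is visible directly: tightening the QoS quantile or the deadline raises the required rate, and hence the number of powered-on servers and the power bill.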