37 results for Tessellation-based model


Relevance:

90.00%

Publisher:

Abstract:

This study explores the ongoing pedagogical development of a number of undergraduate design and engineering programmes in the United Kingdom. Observations and data have been collected over several cohorts to bring a valuable perspective to the approaches piloted across two similar university departments while trialling a number of innovative learning strategies. In addition to the concurrent institutional studies, the work explores curriculum design that applies the principles of Co-Design and multidisciplinary and transdisciplinary learning, with engineering and product design students working alongside each other through a practical, problem-solving learning approach known as the CDIO learning initiative (Conceive, Design, Implement and Operate) [1]. The study builds on previous work presented at the 2010 EPDE conference: The Effect of Personality on the Design Team: Lessons from Industry for Design Education [2]. The subsequent work presented in this paper applies those findings to mixed design and engineering team-based learning, building on the insight gained through a number of industrial process case studies carried out in current design practice. Developments in delivery also align the CDIO principles of learning through doing with a practice-based, collaborative learning experience and include elements of the TRIZ creative problem-solving technique [3]. The paper will outline case studies involving a number of mixed engineering and design student projects that highlight the CDIO principles, combined with an external industrial design brief. It will compare and contrast the learning experience with that of a KTP-derived student project, to examine an industry-based model for student projects. In addition, key areas of best practice will be presented, and student work from each mode will be discussed at the conference.

Relevance:

80.00%

Publisher:

Abstract:

Knowledge maintenance is a major challenge for both knowledge management and the Semantic Web. Operating over the Semantic Web will be a network of collaborating agents, each with its own ontologies or knowledge bases. A change in the knowledge state of one agent may need to be propagated across a number of agents and their associated ontologies. The challenge is to decide how to propagate a change of knowledge state. The effects of a change in knowledge state cannot be known in advance, so an agent cannot know who should be informed unless it adopts a simple 'tell everyone everything' strategy. This situation is highly reminiscent of the classic Frame Problem in AI. We argue that for agent-based technologies to succeed, far greater attention must be given to creating an appropriate model for knowledge update. In a closed system, simple strategies are possible (e.g. 'sleeping dog', 'cheap test' or even complete checking). However, in an open system where cause and effect are unpredictable, a coherent cost-benefit based model of agent interaction is essential. Otherwise, the effectiveness of every act of knowledge update/maintenance is brought into question.
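
As a toy illustration of the cost-benefit idea (the agent names, relevance scores and cost value below are hypothetical, not taken from the paper), an agent might propagate an update only to those peers for whom the estimated benefit of being informed exceeds the cost of the message:

    # Hypothetical sketch: notify only peers for whom the expected benefit
    # of receiving the update outweighs the messaging cost.
    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        name: str
        relevance: dict = field(default_factory=dict)  # peer name -> estimated value of our updates

    def propagate(sender, peers, update, message_cost=0.2):
        notified = []
        for peer in peers:
            expected_benefit = sender.relevance.get(peer.name, 0.0)
            if expected_benefit > message_cost:        # the cost-benefit test
                notified.append(peer.name)
        return notified

    a = Agent("a", relevance={"b": 0.9, "c": 0.05})
    print(propagate(a, [Agent("b"), Agent("c")], "ontology-change"))   # -> ['b']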

Relevance:

80.00%

Publisher:

Abstract:

Cellular thiols are critical moieties in signal transduction and the regulation of gene expression, and ultimately are determinants of specific protein activity. Whilst protein-bound thiols are the critical effector molecules, low molecular weight thiols, such as glutathione, play a central role in cytoprotection through (1) direct consumption of oxidants, (2) regeneration of protein thiols and (3) export of glutathione-containing mixed disulphides. The brain is particularly vulnerable to oxidative stress, as it consumes 20% of the oxygen load, contains high concentrations of polyunsaturated fatty acids and iron in certain regions, and expresses low concentrations of enzymic antioxidants. There is substantial evidence for a role for oxidative stress in neurodegenerative disease, where excitotoxicity, redox cycling and mitochondrial dysfunction have been postulated to contribute to the enhanced oxidative load. Others have suggested that loss of important trophic factors may underlie neurodegeneration. However, the two are not mutually exclusive; using cell-based model systems, low molecular weight antioxidants have been shown to play an important neuroprotective role in vitro, and neurotrophic factors have been suggested to modulate glutathione levels. Glutathione levels are regulated by substrate availability, synthetic and metabolic enzyme activity, and the presence of other antioxidants, which, according to the redox potential, consume or regenerate GSH from its oxidised partner. We therefore investigated the hypothesis that amyloid beta neurotoxicity is mediated by reactive oxygen species, and that trophic factor cytoprotection against oxidative stress is achieved through regulation of glutathione levels. Using PC12 cells as a model system, amyloid beta 25-35 caused a shift in DCF fluorescence after four hours in culture. This fluorescence shift was attenuated by both desferrioxamine and NGF. After four hours, cellular glutathione levels were depleted by as much as 75%; however, 24 hours following oxidant exposure, glutathione concentration was restored to twice that seen in controls. NGF prevented the loss of viability seen after 24 hours of amyloid beta treatment and also protected glutathione levels. NGF decreased the total cellular glutathione concentration but did not affect expression of GCS. In conclusion, loss of glutathione precedes cell death in PC12 cells. However, at sublethal doses the surviving fraction responds to oxidative stress by increasing glutathione levels, and this is achieved, at least in part, at the gene level through upregulation of GCS. Whilst NGF does protect against oxidative toxicity, this is not achieved through upregulation of GCS or glutathione.

Relevance:

80.00%

Publisher:

Abstract:

This paper contributes to the literature on the intra-firm diffusion of innovations by investigating the factors that affect the firm's decision to adopt and use sets of complementary innovations. We define as complementary those innovations whose joint use generates superadditive gains, i.e. the gain from joint adoption is higher than the sum of the gains derived from adopting each innovation in isolation. From a theoretical perspective, we present a simple decision model in which the firm decides 'whether' and 'how much' to invest in each of the innovations under investigation, based upon the expected profit gain from each possible combination of adoption and use. The model shows how the extent of complementarity among the innovations can affect the firm's profit gains and therefore the likelihood that the firm will adopt these innovations jointly rather than individually. From an empirical perspective, we focus on four sets of management practices, namely operating (OMP), monitoring (MMP), targets (TMP) and incentives (IMP) management practices. We show that these sets of practices, although to differing extents, are complementary to each other. We then construct a synthetic indicator of the depth of their use. The resulting intra-firm index reflects not only the number of practices adopted but also the depth of their individual use and the extent of their complementarity. The empirical testing of the decision model is carried out using evidence on the adoption behaviour of a sample of 1,238 UK establishments in the 2004 Workplace Employment Relations Survey (WERS). Our empirical results show that the intra-firm profitability-based model explains more of the variability of joint adoption than models based upon the adoption and use of individual practices. We also investigate whether a number of firm-specific and market characteristics, by affecting the size of the gains that joint adoption can generate, drive the intensity of use of the four innovations. We find that establishment size, foreign ownership, exposure to an international market and the degree of homogeneity of the final product are important determinants of the intensity of joint adoption of the four innovations. Most importantly, our results show that the factors that the economics of innovation literature has found to affect the intensity of use of a technological innovation also affect the intensity of use of sets of innovative management practices. However, they explain only a small part of the diversity in joint adoption and use by the firms in the sample.
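
As a minimal illustration of the superadditivity condition behind the decision model (the practice gains below are hypothetical numbers, not the paper's estimates), the firm adopts the combination of practices with the highest expected profit gain, and two practices are complementary when their joint gain exceeds the sum of the solo gains:

    # Toy sketch: check superadditivity and pick the profit-maximising bundle.
    gains = {
        frozenset():               0.0,
        frozenset({"OMP"}):        1.0,
        frozenset({"MMP"}):        0.8,
        frozenset({"OMP", "MMP"}): 2.3,   # > 1.0 + 0.8, i.e. superadditive
    }

    def is_superadditive(gains, a, b):
        return gains[frozenset({a, b})] > gains[frozenset({a})] + gains[frozenset({b})]

    best_bundle = max(gains, key=gains.get)      # profit-maximising adoption decision
    print(is_superadditive(gains, "OMP", "MMP"), sorted(best_bundle))
    # -> True ['MMP', 'OMP']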

Relevance:

80.00%

Publisher:

Abstract:

Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In the subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits variables that can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: the modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with 'natural' negative outputs and inputs. Journal of the Operational Research Society 57 (11), 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of the Operational Research Society 55 (10), 1111–1121]. A further example explores the advantages of using the new model.
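
For context, the sketch below solves the standard input-oriented CCR envelopment model for strictly positive data, the baseline that the paper extends; it is not the proposed semi-oriented radial measure, and the toy data are hypothetical:

    # Input-oriented CCR efficiency via linear programming (requires numpy, scipy).
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """X: inputs (m x n), Y: outputs (s x n), o: index of the evaluated DMU."""
        m, n = X.shape
        s, _ = Y.shape
        c = np.r_[1.0, np.zeros(n)]                    # minimise theta
        A_in = np.hstack([-X[:, [o]], X])              # sum_j lam_j x_ij <= theta * x_io
        A_out = np.hstack([np.zeros((s, 1)), -Y])      # sum_j lam_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        return res.x[0]

    X = np.array([[2.0, 4.0, 8.0]])   # one input, three DMUs (toy data)
    Y = np.array([[1.0, 2.0, 3.0]])   # one output
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])   # -> [1.0, 1.0, 0.75]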

Relevance:

80.00%

Publisher:

Abstract:

We develop a multi-agent based model to simulate a population comprising two ethnic groups and a peacekeeping force. We investigate the effects of different civilian movement strategies on the resulting violence in this bi-communal population. Specifically, we compare and contrast random and race-based migration strategies. Race-based migration leads to the formation of clusters. Previous work in this area has shown that same-race clustering instigates violent behavior in otherwise passive segments of the population, and our findings confirm this. Furthermore, we show that in settings where only one of the two races adopts race-based migration, it is a winning strategy, especially in violently predisposed populations. On the other hand, in relatively peaceful settings clustering is a restricting factor that causes the race that adopts it to drift into annihilation. Finally, we show that when race-based migration is adopted as a strategy by both ethnic groups, it results in peaceful co-existence even in the most violently predisposed populations.
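
A minimal grid sketch of the two migration strategies (grid size, densities and the clustering measure are my own toy choices, not the paper's model, and the peacekeeping force is omitted):

    # Agents of two groups move either at random or towards cells with more
    # same-group neighbours ("race-based" migration), which produces clusters.
    import random

    SIZE, STEPS = 20, 200
    grid = {(x, y): random.choice([1, 2]) if random.random() < 0.4 else 0
            for x in range(SIZE) for y in range(SIZE)}

    def neighbours(x, y):
        return [((x + dx) % SIZE, (y + dy) % SIZE)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

    def same_group(cell, group):
        return sum(1 for n in neighbours(*cell) if grid[n] == group)

    def step(race_based):
        for cell in [c for c, g in grid.items() if g]:
            group = grid[cell]
            if group == 0:                      # occupant already moved this step
                continue
            empties = [n for n in neighbours(*cell) if grid[n] == 0]
            if not empties:
                continue
            target = (max(empties, key=lambda e: same_group(e, group))
                      if race_based else random.choice(empties))
            grid[cell], grid[target] = 0, group

    for _ in range(STEPS):
        step(race_based=True)                   # flip to False for random migration

    occupied = [c for c, g in grid.items() if g]
    clustering = sum(same_group(c, grid[c]) / max(1, sum(1 for n in neighbours(*c) if grid[n]))
                     for c in occupied) / len(occupied)
    print(round(clustering, 2))                 # higher values indicate stronger same-group clusters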

Relevance:

80.00%

Publisher:

Abstract:

We investigate the policies of (1) restricting social influence and (2) imposing curfews upon interacting citizens in a community. We compare and contrast their effects on the social order and the emerging levels of civil violence. Influence models have been used in the past in the context of decision making in a variety of application domains. The policy of curfews has been utilised with the aim of curbing social violence, but little research has been done on its effectiveness. We develop a multi-agent-based model to simulate a community of citizens and the police force that guards it. We find that restricting social influence does indeed pacify rebellious societies, but has the opposite effect on peaceful ones. On the other hand, our simple model indicates that restricting mobility through curfews has a pacifying effect across all types of society.
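
A highly stylised sketch of the two policy levers (all rules, parameters and thresholds below are my own assumptions, not the paper's model): a curfew is modelled as a probability of staying home, and social influence as a pull of individual grievance towards the visible level of unrest:

    import random

    def run(curfew_prob=0.0, influence=0.3, n=1000, steps=50, threshold=0.1):
        """Fraction of rebellious citizens after `steps` rounds of interaction."""
        grievance = [random.random() for _ in range(n)]
        active = [False] * n
        for _ in range(steps):
            out = [i for i in range(n) if random.random() > curfew_prob]  # curfew keeps some home
            if not out:
                continue
            active_share = sum(active[i] for i in out) / len(out)
            risk = 1.0 - active_share        # visible unrest lowers the perceived risk of arrest
            for i in out:
                grievance[i] += influence * (active_share - grievance[i])  # social influence
                active[i] = grievance[i] - 0.5 * risk > threshold
        return sum(active) / n

    random.seed(1)
    print(round(run(curfew_prob=0.0), 2), round(run(curfew_prob=0.6), 2))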

Relevance:

80.00%

Publisher:

Abstract:

We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period that is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced model based on ordinary differential equations, we discuss the key features of these structures, such as a lower energy than that of traditional DM solitons of the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
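
For reference, the underlying propagation model is the nonlinear Schrödinger equation with a periodically varying dispersion coefficient, which in a standard dimensionless form can be written as (normalisation conventions vary between papers):

    i\,\frac{\partial u}{\partial z} + \frac{d(z)}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = 0, \qquad d(z + L) = d(z),

where d(z) is the periodic local dispersion and L is the dispersion-map period.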

Relevance:

80.00%

Publisher:

Abstract:

We present a stochastic agent-based model for the distribution of personal incomes in a developing economy. We start with the assumption that incomes are determined both by individual labour and by stochastic effects of trading and investment. The income from personal effort alone is distributed about a mean, while the income from trade, which may be positive or negative, is proportional to the trader's income. These assumptions lead to a Langevin model with multiplicative noise, from which we derive a Fokker-Planck (FP) equation for the income probability density function (IPDF) and its variation in time. We find that high earners have a power-law income distribution, while the low-income groups have a Lévy IPDF. Comparing our analysis with Indian survey data (obtained from the World Bank website: http://go.worldbank.org/SWGZB45DN0) taken over many years, we obtain a near-perfect data collapse onto our model's equilibrium IPDF. Using survey data to relate the IPDF to actual food consumption, we define a poverty index (Sen A. K., Econometrica, 44 (1976) 219; Kakwani N. C., Econometrica, 48 (1980) 437), which is consistent with traditional indices but independent of an arbitrarily chosen "poverty line" and therefore less susceptible to manipulation. Copyright © EPLA, 2010.
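
Schematically (the drift and noise coefficients below are generic placeholders rather than the paper's specific terms), a Langevin equation with multiplicative noise and its corresponding Itô Fokker-Planck equation for the IPDF P(I, t) read:

    \frac{dI}{dt} = a(I) + b(I)\,\xi(t), \qquad
    \frac{\partial P(I,t)}{\partial t} = -\frac{\partial}{\partial I}\big[a(I)\,P\big] + \frac{1}{2}\,\frac{\partial^2}{\partial I^2}\big[b^{2}(I)\,P\big],

with white noise satisfying <xi(t) xi(t')> = delta(t - t'); income from trade proportional to the trader's income corresponds to b(I) proportional to I, which is what makes the noise multiplicative.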

Relevance:

80.00%

Publisher:

Abstract:

This study investigates the critical role that opinion leaders (or influentials) play in the adoption process of new products. Recent existing reseach evidence indicates a limited effect of opinion leaders on diffusion processes, yet these studies take into account merely the network position of opinion leaders without addressing their influential power. Empirical findings of our study show that opinion leaders, in addition to having a more central network position, possess more accurate knowledge about a product and tend to be less susceptible to norms and more innovative. Experiments that address these attributes, using an agent-based model, demonstrate that opinion leaders increase the speed of the information stream and the adoption process itself. Furthermore, they increase the maximum adoption percentage. These results indicate that targeting opinion leaders remains a valuable marketing strategy.
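
A toy threshold-diffusion sketch (network size, tie counts and thresholds are my own illustrative assumptions, not the paper's agent-based model) in which opinion leaders have more ties and a lower adoption threshold, so that seeding them speeds up diffusion:

    import random

    def diffuse(n=200, leader_frac=0.05, seed_leaders=True, steps=30):
        """Fraction of agents that have adopted after `steps` rounds."""
        leaders = set(random.sample(range(n), int(n * leader_frac)))
        ties = {i: random.sample(range(n), 20 if i in leaders else 5) for i in range(n)}
        threshold = {i: 0.1 if i in leaders else 0.3 for i in range(n)}  # leaders: less norm-bound
        seeds = sorted(leaders) if seed_leaders else list(range(n))
        adopted = set(random.sample(seeds, 3))
        for _ in range(steps):
            for i in range(n):
                if i not in adopted:
                    share = sum(j in adopted for j in ties[i]) / len(ties[i])
                    if share >= threshold[i]:
                        adopted.add(i)
        return len(adopted) / n

    random.seed(7)
    print(diffuse(seed_leaders=True), diffuse(seed_leaders=False))   # compare targeting strategies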

Relevance:

80.00%

Publisher:

Abstract:

While conventional Data Envelopment Analysis (DEA) models set targets for each operational unit, this paper considers the problem of input/output reduction in a centralized decision-making environment. The purpose of this paper is to develop an approach to the input/output reduction problem that typically occurs in organizations with a centralized decision-making environment. This paper shows that DEA can make an important contribution to this problem and discusses how a DEA-based model can be used to determine an optimal input/output reduction plan. An application in the banking sector with limitations on IT investment shows the usefulness of the proposed method.
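
Schematically, a centralized input-reduction plan can be expressed as a single DEA-type linear programme that contracts the aggregate inputs of all n units while preserving their aggregate outputs (a generic form for orientation only; the model developed in the paper may differ in detail):

    \min_{\theta,\;\lambda_{jk}\ge 0} \ \theta
    \quad \text{s.t.} \quad
    \sum_{j=1}^{n}\sum_{k=1}^{n} \lambda_{jk}\, x_{ik} \le \theta \sum_{j=1}^{n} x_{ij} \ \ \forall i, \qquad
    \sum_{j=1}^{n}\sum_{k=1}^{n} \lambda_{jk}\, y_{rk} \ge \sum_{j=1}^{n} y_{rj} \ \ \forall r.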

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this research was to investigate the effects of Processing Instruction (VanPatten, 1996, 2007), as an input-based model for teaching second language grammar, on Syrian learners' processing abilities. The research investigated the effects of Processing Instruction on the acquisition of English relative clauses by Syrian learners using a quasi-experimental design. Three separate groups were involved in the research: a Processing Instruction group, a Traditional Instruction group and a control group. For assessment, a pre-test, a direct post-test and a delayed post-test were used as the main tools for eliciting data. A questionnaire was also distributed to participants in the Processing Instruction group to give them the opportunity to provide feedback on the treatment they received in comparison with the Traditional Instruction they were used to. Four hypotheses were formulated on the possible effectiveness of Processing Instruction on Syrian learners' linguistic system. It was hypothesised that Processing Instruction would improve learners' processing abilities, leading to an improvement in their linguistic system and, in turn, to better performance in the comprehension and production of English relative clauses. The main data were analysed statistically using ANOVA, supported by Cohen's d calculations, which showed the magnitude of the effects of the three treatments. Results of the analysis showed that both the Processing Instruction and Traditional Instruction groups had improved after treatment; however, the Processing Instruction group significantly outperformed the other two groups in the comprehension of relative clauses. The analysis concluded that Processing Instruction is a useful tool for teaching relative clauses to Syrian learners. This was reinforced by participants' responses to the questionnaire, which favoured Processing Instruction over Traditional Instruction. This research has theoretical and pedagogical implications. Theoretically, the study showed support for the Input Hypothesis: Processing Instruction had a positive effect on input processing, as it affected learners' linguistic system. This was reflected in learners' performance, where learners were able to produce a structure which they had not been asked to produce. Pedagogically, the present research showed that Processing Instruction is a useful tool for teaching English grammar in the context where the experiment was carried out, as it had a large effect on learners' performance.
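
For reference, Cohen's d is the standardised difference between two group means, computed with the pooled standard deviation:

    d = \frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad
    s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}},

where conventional benchmarks treat |d| of roughly 0.2, 0.5 and 0.8 as small, medium and large effects respectively.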

Relevance:

80.00%

Publisher:

Abstract:

Despite much anecdotal and oftentimes empirical evidence that black and ethnic minority employees do not feel integrated into organisational life, and the implications of this lack of integration for their career progression, there is a dearth of research on the nature of the relationship black and ethnic minority employees have with their employing organisations. Additionally, research examining the relationship between diversity management and work outcomes has returned mixed findings. Scholars have attributed this to the lack of an empirically validated measure of workforce diversity management. Accordingly, I sought to address these gaps in the extant literature in a two-part study grounded in social exchange theory. In Study 1, I developed and validated a measure of workforce diversity management practices. Data obtained from a sample of ethnic minority employees from a cross-section of organisations provided support for the validity of the scale. In Study 2, I proposed and tested a social-exchange-based model of the relationship between black and ethnic minority employees and their employing organisations, and assessed the implications of this relationship for their work outcomes. Specifically, I hypothesised: (i) perception of support for diversity, perception of overall justice, and developmental experiences (indicators of integration into organisational life) as mediators of the relationship between diversity management and social exchange with the organisation; (ii) the moderating influence of diversity climate on the relationship between diversity management and these indicators of integration; and (iii) the work outcomes of social exchange with the organisation, defined in terms of career satisfaction, turnover intention and strain. SEM results provide support for most of the hypothesised relationships. The findings of the study contribute to the literature on workforce diversity management in a number of ways. First, the development and validation of a diversity management practice scale constitutes a first step in resolving the difficulty of operationalising and measuring the diversity management construct. Second, the study explicates how and why diversity management practices influence a social exchange relationship with an employing organisation, and the implications of this relationship for the work outcomes of black and ethnic minority employees. My study's focus on employee work outcomes is an important corrective to the predominant focus on organisational-level outcomes of diversity management. Lastly, by focusing on ethno-racial diversity, my research complements the extant research on such workforce diversity indicators as age and gender.

Relevance:

80.00%

Publisher:

Abstract:

The concept of random lasers exploiting multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with "negative absorption" of the interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, the random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by the Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to and even exceed those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include: a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated both in single-mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and perspectives. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength, narrow-band generation, and random fibre lasers operating in different spectral bands in the 1-1.6 μm range. Then we discuss existing and future applications of random fibre lasers, including telecommunication and distributed long-reach sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory.
We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science. Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. We hope that this review will provide background information for research in various fields and will stimulate cross-disciplinary collaborations on random fibre lasers. © 2014 Elsevier B.V.
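
Schematically (conventions and additional terms vary; this generic form is given for orientation rather than as the review's exact model), the NLSE-based description augments the standard fibre propagation equation with fibre loss and distributed Raman gain from the pump:

    \frac{\partial A}{\partial z} + \frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial t^2}
    = i\gamma |A|^2 A + \frac{1}{2}\big(g_R P_p(z) - \alpha\big) A,

where A is the field envelope, \beta_2 the group-velocity dispersion, \gamma the Kerr nonlinearity, \alpha the fibre loss and g_R P_p(z) the local Raman gain set by the pump power P_p(z).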

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a novel intonation modelling approach and demonstrates its applicability using the Standard Yorùbá language. Our approach is motivated by the theory that abstract and realised forms of intonation and other dimensions of prosody should be modelled within a modular and unified framework. In our model, this framework is implemented using the Relational Tree (R-Tree) technique. The R-Tree is a sophisticated data structure for representing a multi-dimensional waveform in the form of a tree. Our R-Tree for an utterance is generated in two steps. First, the abstract structure of the waveform, called the Skeletal Tree (S-Tree), is generated using tone phonological rules for the target language. Second, the numerical values of the perceptually significant peaks and valleys on the S-Tree are computed using a fuzzy-logic-based model. The resulting points are then joined by applying interpolation techniques. The actual intonation contour is synthesised by the Pitch Synchronous Overlap and Add (PSOLA) method using the Praat software. We performed both quantitative and qualitative evaluations of our model. The preliminary results suggest that, although the model does not predict the numerical speech data as accurately as contemporary data-driven approaches, it produces synthetic speech with comparable intelligibility and naturalness. Furthermore, our model is easy to implement, interpret and adapt to other tone languages.
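
To illustrate the interpolation step (the time and F0 targets below are toy values, and piecewise-linear joining is only one of several possible interpolation choices, not necessarily the paper's), the contour can be obtained by interpolating between the numerical peak and valley targets:

    # Join peak/valley targets into a frame-level F0 contour (requires numpy).
    import numpy as np

    targets_t = np.array([0.00, 0.18, 0.35, 0.60, 0.82, 1.00])    # time (s) of peaks/valleys
    targets_f0 = np.array([120., 180., 140., 170., 110., 100.])   # hypothetical F0 targets (Hz)

    frame_times = np.arange(0.0, 1.0, 0.01)                   # 10 ms frames
    contour = np.interp(frame_times, targets_t, targets_f0)   # piecewise-linear join
    print(contour[:5])   # F0 values that would be handed to PSOLA-style resynthesis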