68 results for MULTIPLE MEMORY-SYSTEMS
Abstract:
Radio-frequency identification (RFID) is a popular modern technology proven to deliver a range of value-added benefits, improving system and operational efficiency as well as cost-effectiveness. The operational characteristics of RFID outperform barcodes in many aspects. One of the main challenges for RFID adoption is proving its ability to improve competitiveness. In this paper, we examine multiple real-world examples where RFID technology has been demonstrated to provide significant benefits to industry competitiveness and to enhance human experience in the service sector. This paper explores and surveys existing value-added applications of RFID systems in industry and the service sector, with particular focus on applications in retail, logistics, manufacturing, healthcare, leisure and the public sector. © 2012 AICIT.
Abstract:
In the global economy, innovation is one of the most important competitive assets for companies willing to compete in international markets. As competition moves from standardised products to customised ones that depend on the needs of each specific market, economies of scale are no longer the only winning strategy. Innovation requires firms to establish processes to acquire and absorb new knowledge, leading to the recent theory of Open Innovation. Knowledge sharing and acquisition happen when firms are embedded in networks with other firms, universities, institutions and many other economic actors. Several typologies of innovation and firm networks have been identified, with various geographical spans. One of the first to be modelled was the Industrial Cluster (in Italian, Distretto Industriale), which was long considered the benchmark for innovation and economic development. Other kinds of networks have been modelled since the late 1970s; Regional Innovation Systems represent one of the latest and most widespread models of innovation networks, introduced specifically to combine local networks with the global economy. This model has been explored qualitatively since its introduction but, together with National Innovation Systems, it is among the most inspiring for policy makers and is often cited by them, not always properly. The aim of this research is to set up an econometric model describing Regional Innovation Systems, making it one of the first attempts to test and enhance this theory with a quantitative approach. A dataset of secondary and primary data covering 104 European regions was built in order to run a multiple linear regression, testing whether Regional Innovation Systems are really correlated with regional innovation and with regional innovation in cooperation with foreign partners. An exploratory multiple linear regression was also performed to verify which variables, among those describing a Regional Innovation System, are the most significant for innovating, alone or with foreign partners. In addition, the effectiveness of present innovation policies was tested against the findings of the econometric model. The developed model confirmed the role of Regional Innovation Systems in creating innovation, including in cooperation with international partners: this represents one of the first quantitative confirmations of a theory previously based on qualitative models only. The results also indicated a minor influence of National Innovation Systems: comparing existing innovation policies, at both regional and national level, with our findings revealed the need for a potentially pivotal change in the direction currently followed by policy makers. Last, while confirming the role of a learning environment in a region and the catalyst role of regional administration, this research offers a potential new perspective for the whole private sector in creating a Regional Innovation System.
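As a rough illustration of the kind of econometric model described above (synthetic data and hypothetical variable names only, not the actual 104-region dataset), a multiple linear regression of a regional innovation measure on variables describing a Regional Innovation System could be set up as follows:

```python
# Illustrative sketch only: regress a regional innovation measure on hypothetical
# Regional Innovation System variables using ordinary least squares.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_regions = 104  # the abstract reports data covering 104 European regions

# Hypothetical explanatory variables describing a Regional Innovation System
X = np.column_stack([
    rng.normal(size=n_regions),   # e.g. R&D intensity (placeholder)
    rng.normal(size=n_regions),   # e.g. university-industry links (placeholder)
    rng.normal(size=n_regions),   # e.g. regional policy support (placeholder)
])
# Synthetic dependent variable, e.g. innovation with foreign partners (placeholder)
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n_regions)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary())  # coefficient estimates, t-statistics and p-values
```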
Abstract:
The primary aim of this research is to understand what constitutes management accounting and control (MACs) practice and how these control processes are implicated in the day-to-day work practices and operations of the organisation. It also examines the changes that happen in MACs practices over time as multiple actors within organisational settings interact with each other. I adopt a distinctive practice theory approach (i.e. sociomateriality) and the concept of imbrication in this research to show that MACs practices emerge from the entanglement between human/social agency and material/technological agency within an organisation. Changes in the pattern of MACs practices happen through imbrication processes which are produced as the two agencies entangle. The theoretical approach employed in this research offers an interesting and valuable lens which seeks to reveal the depth of these interactions and uncover the way in which the social and the material imbricate. The theoretical framework helps to reveal how these constructions impact on and produce modifications of MACs practices. The exploration of control practices at different hierarchical levels (i.e. from the operational to middle management and senior management levels) using the concept of the imbrication process also maps the dynamic flow of controls from operational to top management and vice versa in the organisation. The empirical data which is the focus of this research has been gathered from a case study of an organisation within a large, vertically integrated palm oil company in Malaysia, specifically the refinery sector. The palm oil industry is a significant industry in Malaysia, contributing an average of 4.5% of Malaysian Gross Domestic Product over the period 1990-2010. The Malaysian palm oil industry also has a significant presence in the global food oil supply, contributing 26% of total global oils and fats trade in 2010. The case organisation is a significant contributor to the Malaysian palm oil industry. The research access has provided an interesting opportunity to explore the interactions between different groups of people and material/technology in a relatively heavy-process food industry setting. My research examines how these interactions shape and are shaped by control practices in a dynamic cycle of imbrications over both short and medium time periods.
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems alongside Enterprise 2.0. ERP is well known for efficient, IT-based business process management, while Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies indicate that efficiency and flexibility may seem incompatible because they are different business objectives and may exist in different organizational environments. However, the chapter breaks with this traditional view by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of combining ERP with Enterprise 2.0 from the process, organization, and people paradigms. © 2013 by IGI Global.
Abstract:
This thesis is a study of performance management of Complex Event Processing (CEP) systems. Since CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, these characteristics introduce new challenges and opportunities for the performance management of CEP systems. The methodologies used to benchmark CEP systems in many performance studies focus on scaling the load injection, but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes an approach that evaluates the performance of CEP engines' functional behaviours on events, and develops a benchmark platform for CEP systems: CEPBen. The CEPBen benchmark platform is developed to explore the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and evaluating their performance. Studies on factors and new metrics are carried out using the CEPBen benchmark platform on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of targeted events is proposed as a quality-of-service metric to be used alongside the traditional response time of CEP systems. Maximum query load is proposed as a capacity indicator with respect to query complexity, and the number of live objects in memory as a performance indicator with respect to memory management. Query depth is studied as a factor that influences CEP system performance.
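A minimal sketch of the measurement idea (a toy event loop, not the CEPBen platform or the Esper API) showing how the response time of targeted events can be collected separately from the overall response time:

```python
# Toy sketch with an assumed event model: measure per-event processing latency and
# keep a separate series for "targeted" events.
import time
import random

def process(event):
    # stand-in for filtering / transformation / pattern detection work
    return event["type"] == "TARGET" and event["value"] > 0.9

events = [{"type": random.choice(["TARGET", "OTHER"]),
           "value": random.random()} for _ in range(10_000)]

overall, targeted = [], []
for ev in events:
    t0 = time.perf_counter()
    process(ev)
    latency = time.perf_counter() - t0
    overall.append(latency)
    if ev["type"] == "TARGET":
        targeted.append(latency)   # response time of targeted events only

print(f"mean overall latency : {sum(overall) / len(overall):.2e} s")
print(f"mean targeted latency: {sum(targeted) / len(targeted):.2e} s")
```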
Abstract:
The link between off-target anticholinergic effects of medications and acute cognitive impairment in older adults requires urgent investigation. We aimed to determine whether a relevant in vitro model may aid the identification of anticholinergic responses to drugs and the prediction of anticholinergic risk during polypharmacy. In this preliminary study we employed a co-culture of human-derived neurons and astrocytes (NT2.N/A) derived from the NT2 cell line. NT2.N/A cells possess much of the functionality of mature neurons and astrocytes, key cholinergic phenotypic markers and muscarinic acetylcholine receptors (mAChRs). The cholinergic response of NT2 astrocytes to the mAChR agonist oxotremorine was examined using the fluorescent dye fluo-4 to quantitate increases in intracellular calcium [Ca2+]i. Inhibition of this response by drugs classified as severe (dicycloverine, amitriptyline), moderate (cyclobenzaprine) and possible (cimetidine) on the Anticholinergic Cognitive Burden (ACB) scale, was examined after exposure to individual and pairs of compounds. Individually, dicycloverine had the most significant effect regarding inhibition of the astrocytic cholinergic response to oxotremorine, followed by amitriptyline then cyclobenzaprine and cimetidine, in agreement with the ACB scale. In combination, dicycloverine with cyclobenzaprine had the most significant effect, followed by dicycloverine with amitriptyline. The order of potency of the drugs in combination frequently disagreed with predicted ACB scores derived from summation of the individual drug scores, suggesting current scales may underestimate the effect of polypharmacy. Overall, this NT2.N/A model may be appropriate for further investigation of adverse anticholinergic effects of multiple medications, in order to inform clinical choices of suitable drug use in the elderly.
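As a purely illustrative calculation (placeholder numbers, not the study's measurements), the percent inhibition of the oxotremorine-evoked response can be compared between individual drugs, a drug pair, and a naive additive prediction of the kind implied by summing ACB-style scores:

```python
# Illustrative only: percent inhibition of a normalised fluo-4 [Ca2+]i response.
# All response values below are hypothetical placeholders.
baseline = 1.00                      # normalised response to oxotremorine alone
single = {"dicycloverine": 0.35,     # residual response with each drug present
          "cyclobenzaprine": 0.70}
pair_measured = 0.20                 # residual response with both drugs present

def inhibition(residual, baseline=baseline):
    return 100.0 * (1.0 - residual / baseline)

individual = {drug: inhibition(r) for drug, r in single.items()}
predicted_pair = min(100.0, sum(individual.values()))   # naive additive prediction
measured_pair = inhibition(pair_measured)

print(individual)
print(f"additive prediction: {predicted_pair:.0f}%  vs measured: {measured_pair:.0f}%")
```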
Abstract:
We report an extension of the procedure devised by Weinstein and Shanks (Memory & Cognition 36:1415-1428, 2008) to study false recognition and priming of pictures. Participants viewed scenes with multiple embedded objects (seen items), then studied the names of these objects and the names of other objects (read items). Finally, participants completed a combined direct (recognition) and indirect (identification) memory test that included seen items, read items, and new items. In the direct test, participants recognized pictures of seen and read items more often than new pictures. In the indirect test, participants' speed at identifying those same pictures was improved for pictures that they had actually studied, and also for falsely recognized pictures whose names they had read. These data provide new evidence that a false-memory induction procedure can elicit memory-like representations that are difficult to distinguish from "true" memories of studied pictures. © 2012 Psychonomic Society, Inc.
Abstract:
This article demonstrates the use of embedded fibre Bragg gratings as a vector bending sensor to monitor two-dimensional shape deformation of a shape memory polymer plate. The shape memory polymer plate was made using thermal-responsive epoxy-based shape memory polymer materials, and the two fibre Bragg grating sensors were orthogonally embedded, one on the top and the other on the bottom layer of the plate, in order to measure the strain distribution in the longitudinal and transverse directions separately and also to provide a temperature reference. When the shape memory polymer plate was bent at different angles, the Bragg wavelengths of the embedded fibre Bragg gratings showed a red-shift of 50 pm/° caused by the bend-induced tensile strain on the plate surface. The finite element method was used to analyse the stress distribution for the whole shape recovery process. The strain transfer rate between the shape memory polymer and the optical fibre was also calculated from the finite element method and determined from experimental results to be around 0.25. During the experiment, the embedded fibre Bragg gratings showed very high temperature sensitivity due to the high thermal expansion coefficient of the shape memory polymer: around 108.24 pm/°C below the glass transition temperature (Tg) and 47.29 pm/°C above Tg. Therefore, the orthogonal arrangement of the two fibre Bragg grating sensors can provide a temperature compensation function, as one of the fibre Bragg gratings only measures the temperature while the other is subjected to the directional deformation. © The Author(s) 2013.
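A short worked example of the temperature compensation described above, using the sensitivities quoted in the abstract; the bend angle and temperature drift are hypothetical values, and the clean split of bend and temperature responses between the two gratings is an illustrative assumption:

```python
# Worked arithmetic under stated assumptions: one FBG sees bend strain plus temperature,
# the orthogonal reference FBG sees temperature only.
K_BEND = 50.0              # pm per degree of bend angle (from the abstract)
K_TEMP_BELOW_TG = 108.24   # pm/degC below the glass transition temperature

bend_angle = 20.0          # deg, hypothetical deformation
delta_T = 2.0              # degC, hypothetical temperature drift

shift_sensing_fbg = K_BEND * bend_angle + K_TEMP_BELOW_TG * delta_T  # strain + temperature
shift_reference_fbg = K_TEMP_BELOW_TG * delta_T                      # temperature only

# Subtract the reference grating's thermal shift, then convert back to a bend angle
compensated = shift_sensing_fbg - shift_reference_fbg
print(f"recovered bend angle: {compensated / K_BEND:.1f} deg")
```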
Abstract:
We overview our recent developments in the theory of dispersion-managed (DM) solitons within the context of optical applications. First, we present a class of localized solutions with a period that is a multiple of that of the standard DM soliton in the nonlinear Schrödinger equation with periodic variations of the dispersion. In the framework of a reduced model based on ordinary differential equations, we discuss the key features of these structures, such as a smaller energy compared to traditional DM solitons with the same temporal width. Next, we present new results on dissipative DM solitons, which occur in the context of mode-locked lasers. By means of numerical simulations and a reduced variational model of the complex Ginzburg-Landau equation, we analyze the influence of the different dissipative processes that take place in a laser.
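For reference, the nonlinear Schrödinger equation with periodically varying dispersion referred to above is commonly written (in normalised units; the authors' exact normalisation may differ) as

i \frac{\partial u}{\partial z} + \frac{d(z)}{2} \frac{\partial^2 u}{\partial t^2} + |u|^2 u = 0, \qquad d(z + L) = d(z),

where d(z) is the periodic dispersion map with period L; standard DM solitons recover their shape once per map period, whereas the localized solutions described above repeat only after an integer multiple of L.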
Abstract:
There is an increasing call for applications which use a mixture of batteries. These hybrid battery solutions may contain different battery types, for example second-life ex-transportation batteries in grid support applications, or a combination of high-power/low-energy and low-power/high-energy batteries to meet multiple energy requirements. They may even contain the same battery type but in different states of health, for example when a failed battery is hot-swapped in an application without replacing all the batteries, leaving batteries with different performances, capacities and impedances. These types of applications typically use multi-modular converters to allow hot swapping to take place without affecting the overall performance of the system. A key element of the control is how the different battery performance characteristics are taken into account and how the power is then shared among the different batteries in line with their performance. This paper proposes a control strategy which allows the power in the batteries to be effectively distributed, even under capacity fade conditions, using an adaptive power sharing strategy. The strategy is then validated on a system of three different battery types connected to a multi-modular converter, both with and without capacity fade mechanisms in place.
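A simplified sketch of proportional power sharing among dissimilar battery modules (an assumed allocation rule for illustration, not the paper's adaptive strategy): power is allocated in proportion to each module's remaining usable energy, so a module suffering capacity fade is automatically de-rated.

```python
# Illustrative allocation rule: split a power demand across parallel modules in
# proportion to remaining usable energy (capacity * state of charge).
def share_power(p_demand_kw, capacities_kwh, soc):
    """Return the power (kW) assigned to each module."""
    remaining = [c * s for c, s in zip(capacities_kwh, soc)]
    total = sum(remaining)
    if total == 0:
        return [0.0] * len(capacities_kwh)
    return [p_demand_kw * r / total for r in remaining]

# Three dissimilar modules, hypothetical values: a faded second-life pack,
# a high-energy pack and a high-power pack
capacities = [8.0, 20.0, 5.0]   # kWh
soc = [0.9, 0.6, 0.8]           # state of charge
print(share_power(10.0, capacities, soc))  # kW allocated to each module
```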
Abstract:
The contemporary business environment involves IT being invested in and shared by multiple stakeholders in collaborative, platform-based, and relational arrangements where the objective is to co-create value. Traditional IT-enabled business value has therefore been extended towards IT value co-creation involving multiple stakeholders. In this paper, we present a conceptual development of IT-based value co-creation in the context of online crowdsourcing. Based on the existing literature, we distinguish multiple crowdsourcing types (models) by analyzing the attributes of the crowd and the roles of the client, the platform and the crowd, which act as the key stakeholders in the value co-creation process, and we describe the major interactions between these stakeholders. Our conceptual development suggests that different combinations of value co-creation layers are evident in different crowdsourcing models.
Abstract:
Progress on advanced active and passive photonic components that are required for high-speed optical communications over hollow-core photonic bandgap fiber at wavelengths around 2 μm is described in this paper. Single-frequency lasers capable of operating at 10 Gb/s and covering a wide spectral range are realized. A comparison is made between waveguide and surface-normal photodiodes, with the latter showing good sensitivity up to 15 Gb/s. Passive waveguides, 90° optical hybrids, and an arrayed waveguide grating with 100-GHz channel spacing are demonstrated on a large spot-size waveguide platform. Finally, a strong electro-optic effect using the quantum-confined Stark effect in strain-balanced multiple quantum wells is demonstrated and used in a Mach-Zehnder modulator capable of operating at 10 Gb/s.
Abstract:
Relay selection has been considered an effective method to improve the performance of cooperative communication. However, the Channel State Information (CSI) used in relay selection can be outdated, yielding severe performance degradation of cooperative communication systems. In this paper, we investigate relay selection under outdated CSI in a Decode-and-Forward (DF) cooperative system to improve its outage performance. We formulate an optimization problem in which the set of relays that forwards data is optimized to minimize the probability of outage conditioned on the outdated CSI of all the decodable relays' links. We then propose a novel multiple-relay selection strategy based on the solution of the optimization problem. Simulation results show that the proposed relay selection strategy achieves a large improvement in outage performance compared with existing relay selection strategies for combating outdated CSI reported in the literature.
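A toy illustration of the subset-selection idea (assuming independent relay links, equal power split across the chosen relays, and a crude Rayleigh-fading success model; none of this reflects the paper's exact formulation): brute-force search for the relay subset that minimises the outage probability conditioned on outdated CSI estimates.

```python
# Toy model: choose the subset of decodable relays minimising conditional outage,
# trading diversity (more relays) against per-relay power (equal split).
from itertools import combinations
import math

def link_success_prob(gamma_est, power_fraction, threshold=1.0, rho=0.8):
    """Crude conditional success probability for one relay link.
    gamma_est: SNR inferred from outdated CSI; rho: assumed CSI correlation."""
    expected_gamma = rho * gamma_est * power_fraction + (1.0 - rho)
    return math.exp(-threshold / expected_gamma)   # Rayleigh-style tail probability

def best_subset(gamma_estimates):
    relays = range(len(gamma_estimates))
    best, best_outage = (), 1.0
    for k in range(1, len(gamma_estimates) + 1):
        for subset in combinations(relays, k):
            frac = 1.0 / k   # total relay power shared equally among the subset
            # outage if every selected relay's forwarded copy fails (independence assumed)
            outage = math.prod(1.0 - link_success_prob(gamma_estimates[i], frac)
                               for i in subset)
            if outage < best_outage:
                best, best_outage = subset, outage
    return best, best_outage

print(best_subset([2.5, 1.2, 0.4, 3.0]))   # chosen relay indices and outage probability
```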
Abstract:
This paper examines whether the observed long memory behavior of log-range series is to some extent spurious and whether it can be explained by the presence of structural breaks. Utilizing stock market data we show that the characterization of log-range series as long memory processes can be a strong assumption. Moreover, we find that all examined series experience a large number of significant breaks. Once the breaks are accounted for, the volatility persistence is eliminated. Overall, the findings suggest that volatility can be adequately represented, at least in-sample, through a multiple breaks process and a short run component.
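An illustrative sketch (synthetic data, not the log-range series used in the paper) of how structural breaks in the mean can inflate a log-periodogram (GPH-style) estimate of the long-memory parameter d, and how the apparent persistence falls once the breaks are accounted for:

```python
# Illustration: a short-memory series with level shifts looks "long memory" to a
# GPH log-periodogram regression until the breaks are removed.
import numpy as np

def gph_d(x, power=0.5):
    """Geweke-Porter-Hudak estimate of the fractional-integration parameter d."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    m = int(n ** power)                                   # number of low frequencies used
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = -np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    return np.polyfit(regressor, np.log(periodogram), 1)[0]   # slope = d

rng = np.random.default_rng(1)
x = rng.normal(size=3000)          # short-memory noise
x[1000:2000] += 1.5                # two structural breaks in the mean
x[2000:] += -1.0

breaks_removed = x.copy()
for seg in (slice(0, 1000), slice(1000, 2000), slice(2000, None)):
    breaks_removed[seg] -= breaks_removed[seg].mean()

print(f"d with breaks ignored : {gph_d(x):.2f}")               # typically inflated
print(f"d with breaks removed : {gph_d(breaks_removed):.2f}")  # close to zero
```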
Abstract:
The realisation of an eventual low-voltage (LV) Smart Grid with a complete communication infrastructure is a gradual process. During this evolution the protection scheme of distribution networks should be continuously adapted and optimised to fit the protection and cost requirements of the time. This paper reviews practices and research around the design of an effective, adaptive and economical distribution network protection scheme. The background of the topic is introduced, and potential problems are defined from conventional protection theories and new Smart Grid technologies. Challenges are identified, and possible solutions are outlined as a pathway towards flexible and reliable LV protection systems.