Abstract:
Although the branding literature commenced during the 1940s, the first publications related to destination branding did not emerge until half a century later. A review of 74 destination branding publications by 102 authors from the first 10 years of destination branding literature (1998-2007) found at least nine potential research gaps warranting attention by researchers. In particular, there has been a lack of research examining the extent to which brand positioning campaigns have been successful in enhancing brand equity in the manner intended in the brand identity. The purpose of this paper is to report the results of an investigation of brand equity tracking for a competitive set of destinations in Queensland, Australia between 2003 and 2007. A hierarchy of consumer-based brand equity (CBBE) provided an effective means to monitor destination brand positions over time. A key implication of the results was the finding that there was no change in brand positions for any of the five destinations over the four-year period. This leads to the proposition that destination position change within a competitive set will only occur slowly over a long period of time. The tabulation of 74 destination branding case studies, research papers, conceptual papers and web content analyses provides students and researchers with a useful resource on the current state of the field.
Abstract:
This paper presents an Airborne Systems Laboratory for Automation Research. The Airborne Systems Laboratory (ASL) is a Cessna 172 aircraft that has been specially modified and equipped by ARCAA specifically for research into future aircraft automation technologies, including Unmanned Airborne Systems (UAS). This capability has been developed over a long period of time, initially through the hire of aircraft and finally through the purchase and modification of a dedicated flight-testing capability. The ASL has been equipped with a payload system that provides secure mounting, power, aircraft state data, a flight management system and a real-time subsystem. The result is a cost-effective platform allowing real-world flight-testing on a range of projects.
Abstract:
Ad hoc networks are vulnerable to attacks due to their distributed nature and lack of infrastructure. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help to perceive the specific trust level of other nodes. Clustering protocols can be taken as an additional advantage in these processing-constrained networks to collaboratively detect intrusions with less power usage and minimal overhead. Existing clustering protocols are not suitable for intrusion detection purposes because they are linked with the routes. Route establishment and route renewal affect the clusters and, as a consequence, the processing and traffic overhead increases due to the instability of clusters. Ad hoc networks are battery- and power-constrained, and therefore a trusted monitoring node should be available to detect and respond to intrusions in time. This can be achieved only if the clusters are stable for a long period of time. If the clusters are regularly changed due to routes, intrusion detection will not prove to be effective. Therefore, a generalized clustering algorithm has been proposed that can run on top of any routing protocol and can monitor intrusions constantly irrespective of the routes. The proposed simplified clustering scheme has been used to detect intrusions, resulting in high detection rates and low processing and memory overhead irrespective of the routes, connections, traffic types and mobility of nodes in the network. Clustering is also useful for detecting intrusions collaboratively, since an individual node can neither detect a malicious node alone nor take action against that node on its own.
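The route-independent clustering idea above can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes a weight-based cluster-head election in which a node's suitability to act as a monitoring head is computed from its own attributes (residual battery and connectivity), so cluster membership does not change when routes do. The attribute names and weight coefficients are illustrative.

```python
# Hypothetical sketch of route-independent, weight-based clustering:
# cluster heads are elected from node attributes rather than from active
# routes, so clusters stay stable as routes change. Weights are illustrative.

def elect_cluster_heads(nodes, neighbors):
    """nodes: {id: {'battery': float}}; neighbors: {id: set of neighbour ids}.
    Returns the list of elected cluster-head ids."""
    # Weight favours well-connected nodes with battery to spare for monitoring.
    weight = {n: 0.6 * nodes[n]['battery'] + 0.4 * len(neighbors[n])
              for n in nodes}
    heads, covered = [], set()
    for n in sorted(nodes, key=lambda n: weight[n], reverse=True):
        if n not in covered:
            heads.append(n)                  # n becomes a cluster head
            covered.add(n)
            covered.update(neighbors[n])     # its neighbours join n's cluster
    return heads
```

Because the election depends only on local node state, re-clustering is needed only when nodes move or drain, not on every route renewal, which is the stability property the abstract argues for.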
Abstract:
Many infrastructure and utility systems, such as electricity and telecommunications in Europe and North America, used to be operated as monopolies, if not state-owned enterprises. However, they have now been disintegrated into groups of smaller companies managed by different stakeholders. Railways are no exception. Since the early 1980s, there have been reforms in the shape of restructuring of the national railways in different parts of the world. Continuous refinements are still conducted to allow better utilisation of railway resources and quality of service. There has been a growing interest within the industry to understand the impacts of these reforms on operational efficiency and constraints. A number of post-evaluations have been conducted by analysing the performance of the stakeholders in terms of their profits (Crompton and Jupe 2003), quality of train service (Shaw 2001) and engineering operations (Watson 2001). Results from these studies are valuable for future improvement of the system, followed by a new cycle of post-evaluations. However, direct implementation of these changes is often costly and the consequences take a long period of time (e.g. years) to surface. With the advance of fast computing technologies, computer simulation is a cost-effective means to evaluate a hypothetical change in a system prior to actual implementation. For example, simulation suites have been developed to study a variety of traffic control strategies according to sophisticated models of train dynamics, traction and power systems (Goodman, Siu and Ho 1998, Ho and Yeung 2001). Unfortunately, under the restructured railway environment, it is by no means easy to model the complex behaviour of the stakeholders and the interactions between them. The multi-agent system (MAS) is a recently developed modelling technique which may be useful in assisting the railway industry to conduct simulations on the restructured railway system.
In MAS, a real-world entity is modelled as a software agent that is autonomous, reactive to changes, and able to initiate proactive actions and social communicative acts. It has been applied in the areas of supply-chain management processes (García-Flores, Wang and Goltz 2000, Jennings et al. 2000a, b) and e-commerce activities (Au, Ngai and Parameswaran 2003, Liu and You 2003), in which the objectives and behaviour of the buyers and sellers are captured by software agents. It is therefore beneficial to investigate the suitability and feasibility of applying agent modelling in railways and the extent to which it might help in developing better resource management strategies. This paper sets out to examine the benefits of using MAS to model the resource management process in railways. Section 2 first describes the business environment after the railway reforms. The problems emerging from the restructuring process are then identified in section 3. Section 4 describes the realisation of a MAS for railway resource management under the restructured scheme and the feasibility studies expected from the model.
Abstract:
The field of literacy studies has always been challenged by the changing technologies that humans have used to express, represent and communicate their feelings, ideas, understandings and knowledge. However, while the written word has remained central to literacy processes over a long period, it is generally accepted that there have been significant changes to what constitutes ‘literate’ practice. In particular, the status of the printed word has been challenged by the increasing dominance of the image, along with the multimodal meaning-making systems facilitated by digital media. For example, Gunther Kress and other members of the New London Group have argued that the second half of the twentieth century saw a significant cultural shift from the linguistic to the visual as the dominant semiotic mode. This, in turn, they suggest, was accompanied by a cultural shift ‘from page to screen’ as the dominant space of representation (e.g. Cope & Kalantzis, 2000; Kress, 2003; New London Group, 1996). In a similar vein, Bill Green has noted that we have witnessed a shift from the regime of the print apparatus to a regime of the digital electronic apparatus (Lankshear, Snyder and Green, 2000). For these reasons, the field of literacy education has been challenged to find new ways to conceptualise what is meant by ‘literacy’ in the twenty-first century and to rethink the conditions under which children might best be taught to be fully literate so that they can operate with agency in today’s world.
Abstract:
The flood flow in urbanised areas constitutes a major hazard to the population and infrastructure, as seen during the summer 2010-2011 floods in Queensland (Australia). Flood flows in urban environments have been studied relatively recently, although no study has considered the impact of turbulence in the flow. During the 12-13 January 2011 flood of the Brisbane River, some turbulence measurements were conducted in an inundated urban environment in Gardens Point Road next to Brisbane's central business district (CBD) at relatively high frequency (50 Hz). The properties of the sediment flood deposits were characterised and the acoustic Doppler velocimeter unit was calibrated to obtain both instantaneous velocity components and suspended sediment concentration in the same sampling volume with the same temporal resolution. While the flow motion in Gardens Point Road was subcritical, the water elevations and velocities fluctuated with a distinctive period between 50 and 80 s. The low-frequency fluctuations were linked with some local topographic effects: i.e., a local choke induced by an upstream constriction between stairwells caused slow oscillations with a period close to the natural sloshing period of the car park. The instantaneous velocity data were analysed using a triple decomposition, and the same triple decomposition was applied to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux data. The velocity fluctuation data showed a large energy component in the slow fluctuation range. For the first two tests at z = 0.35 m, the turbulence data suggested some isotropy. At z = 0.083 m, on the other hand, the findings indicated some flow anisotropy. The suspended sediment concentration (SSC) data presented a general trend of increasing SSC with decreasing water depth. During one test (T4), some long-period oscillations were observed with a period of about 18 minutes.
The cause of these oscillations remains unknown to the authors. The last test (T5) took place in very shallow water with high suspended sediment concentrations. It is suggested that the flow in the car park was disconnected from the main channel. Overall the flow conditions at the sampling sites corresponded to a specific momentum between 0.2 and 0.4 m², which would be near the upper end of the scale for safe evacuation of individuals in flooded areas. However, the authors do not believe the evacuation of individuals in Gardens Point Road would have been safe because of the intense water surges and flow turbulence. More generally, any criterion for safe evacuation based solely upon the flow velocity, water depth or specific momentum cannot account for the hazards caused by flow turbulence, water depth fluctuations and water surges.
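The triple decomposition mentioned in the abstract splits a measured signal into a time-average, a slow (long-period) fluctuation and a fast turbulent residual. A minimal sketch is given below, assuming the slow component is extracted with a centred moving average; the choice of filter and window length (e.g. matched to the 50-80 s fluctuation period) is an assumption, not the study's exact method.

```python
# Minimal sketch of a triple decomposition: x(t) = mean + slow + turbulent.
# The slow (long-period) component is taken as a centred moving average;
# the window length is an illustrative assumption.

import numpy as np

def triple_decompose(x, window):
    """Split a 1-D record x into (time-average, slow fluctuation, residual)."""
    mean = x.mean()
    detrended = x - mean
    kernel = np.ones(window) / window
    slow = np.convolve(detrended, kernel, mode='same')  # low-pass part
    turb = detrended - slow                             # fast turbulent residual
    return mean, slow, turb
```

By construction the three components sum back to the original record, so the same split can be applied consistently to the depth, velocity and sediment-flux series.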
Abstract:
The development of highway infrastructure typically requires major capital input over a long period. This often causes serious financial constraints for investors. The push for sustainability has added new dimensions to the complexity of evaluating highway projects, particularly on the cost front. This makes the determination of long-term viability an even more precarious exercise. Life-cycle costing analysis (LCCA) is generally recognised as a valuable tool for the assessment of financial decisions on construction works. However, to date, existing LCCA models are deficient in dealing with sustainability factors, particularly for infrastructure projects, due to their inherent focus on economic issues alone. This research probed into the major challenges of implementing sustainability in highway infrastructure development in terms of financial concerns and obligations. Using the results of a literature review, a questionnaire survey of industry stakeholders and semi-structured interviews with senior practitioners involved in highway infrastructure development, the research identified the relative importance of cost components relating to sustainability measures and, on that basis, developed ways of improving existing LCCA models to incorporate sustainability commitments into long-term financial management. On this platform, a decision support model incorporating the Fuzzy Analytical Hierarchy Process and LCCA was developed for the evaluation of the specific cost components of most concern to infrastructure stakeholders. Two real highway infrastructure projects in Australia were then used for testing, application and validation before the decision support model was finalised. Improved industry understanding and tools such as the developed model will lead to positive sustainability deliverables while ensuring financial viability over the lifecycle of highway infrastructure projects.
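The mechanics underlying the AHP part of such a decision support model can be sketched briefly. The fuzzy extension used in the research replaces crisp pairwise judgements with fuzzy numbers; the simplified crisp version below, using the common geometric-mean prioritisation, is shown only to illustrate how a pairwise comparison matrix of cost components is turned into weights, and is not the study's model.

```python
# Sketch of classical (crisp) AHP prioritisation by the geometric-mean method:
# a reciprocal pairwise comparison matrix of criteria is reduced to a
# normalised weight vector. Illustrative only; the study uses a fuzzy variant.

import math

def ahp_weights(matrix):
    """matrix[i][j] states how strongly criterion i is preferred over j
    (reciprocal matrix, matrix[j][i] == 1 / matrix[i][j])."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gmeans)
    return [g / total for g in gmeans]                         # normalise to sum 1
```

The resulting weights could then scale the corresponding cost components inside a life-cycle cost sum.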
Abstract:
During a major flood event, the inundation of urban environments leads to complicated flow motion, most often associated with significant sediment fluxes. In the present study, a series of field measurements was conducted in an inundated section of the City of Brisbane (Australia) around the peak of a major flood in January 2011. Experiments were performed to use the ADV backscatter amplitude as a surrogate estimate of the suspended sediment concentration (SSC) during the flood event. The flood water deposit samples were predominantly silty material with a median particle size of about 25 μm, and they exhibited a non-Newtonian behavior under rheological testing. In the inundated urban environment during the flood, estimates of suspended sediment concentration presented a general trend of increasing SSC with decreasing water depth. The suspended sediment flux data showed substantial sediment flux amplitudes, consistent with the murky appearance of the floodwaters. Altogether the results highlighted the large suspended sediment loads and fluctuations in the inundated urban setting, possibly associated with a non-Newtonian behavior. During the receding flood, some unusual long-period oscillations were observed (periods of about 18 min), although the cause of these oscillations remains unknown. The field deployment was conducted in challenging conditions, highlighting a number of practical issues during a natural disaster.
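Using backscatter amplitude as an SSC surrogate requires a calibration against samples of known concentration, which is then inverted on the field record. A minimal sketch is given below, assuming a linear fit between amplitude and log10(SSC), a common form for acoustic backscatter calibrations; the functional form and all coefficients are assumptions, not the study's calibration.

```python
# Sketch of an ADV backscatter -> SSC calibration: least-squares fit of
# log10(SSC) = a * amplitude + b on known samples, then inversion on field
# data. The linear-in-log form is an assumed, commonly used relationship.

import math

def fit_calibration(amplitudes, concentrations):
    """Return (a, b) from a least-squares fit of log10(SSC) vs amplitude."""
    n = len(amplitudes)
    ys = [math.log10(c) for c in concentrations]
    sx, sy = sum(amplitudes), sum(ys)
    sxx = sum(a * a for a in amplitudes)
    sxy = sum(a * y for a, y in zip(amplitudes, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def amplitude_to_ssc(amplitude, a, b):
    """Invert the calibration to estimate SSC from a field amplitude."""
    return 10 ** (a * amplitude + b)
```

Once fitted, every instantaneous amplitude sample in the velocity record can be mapped to an SSC estimate at the same temporal resolution, which is what allows velocity and sediment flux to be computed in the same sampling volume.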
Abstract:
Big Data is a rising IT trend similar to cloud computing, social networking or ubiquitous computing. Big Data can offer beneficial scenarios in the e-health arena. However, to gain benefits such as finding cures for infectious diseases while protecting patient privacy, Big Data needs to be kept secure over a long period of time. It is therefore beneficial to be able to analyse Big Data and extract meaningful information while the data is stored securely, which makes the analysis of various database encryption techniques essential. In this study, we simulated three types of technical environments, namely Plain-text, Microsoft Built-in Encryption, and custom Advanced Encryption Standard, using a Bucket Index in Data-as-a-Service. The results showed that custom AES-DaaS has a faster range query response time than MS built-in encryption. Furthermore, while carrying out the scalability test, we found that there are performance thresholds depending on physical IT resources. Therefore, for the purpose of efficient Big Data management in e-health, it is worth examining these scalability limits even under a cloud computing environment. In addition, when designing an e-health database, both patient privacy and system performance need to be treated as top priorities.
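The bucket-index technique that makes range queries possible over encrypted data can be sketched as follows: each value is stored encrypted, but tagged with a coarse bucket id in the clear; the server answers a range query by bucket ids alone, and the client decrypts the candidates and discards false positives. The bucket width is an assumption, and the XOR "cipher" below is a toy placeholder standing in for AES, used only to keep the sketch self-contained.

```python
# Sketch of bucket-indexed range queries over encrypted values. The XOR
# transform is a placeholder for real AES encryption; bucket width is an
# illustrative assumption.

BUCKET_WIDTH = 10

def bucket_id(value):
    return value // BUCKET_WIDTH

def toy_encrypt(value, key=0x5A):
    return value ^ key          # placeholder: a real system would use AES

def toy_decrypt(cipher, key=0x5A):
    return cipher ^ key

def store(records):
    """Server-side table: (clear bucket id, ciphertext) pairs."""
    return [(bucket_id(v), toy_encrypt(v)) for v in records]

def range_query(table, lo, hi):
    """Server filters by bucket ids only; client decrypts and refines."""
    candidate_buckets = set(range(bucket_id(lo), bucket_id(hi) + 1))
    candidates = [c for b, c in table if b in candidate_buckets]
    return sorted(v for v in (toy_decrypt(c) for c in candidates)
                  if lo <= v <= hi)
```

Coarser buckets leak less about the plaintext distribution but return more false positives for the client to decrypt, which is the privacy/performance trade-off the study's response-time comparison reflects.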
Abstract:
Most existing marinas are boat parking/storage and servicing facilities that have been built over a long period of time for the convenience of local boat owners.
Abstract:
A collection of short works which are the result of a year-long period of movement reflection and investigation by Liz Roche with a core group of Rex Levitates dancers. The works appear as a wash of colour, movement, rhythm and emotion; human embodiment as a dynamic event. The audience completes the picture. "A picture lives by companionship, expanding and quickening in the eyes of the sensitive observer" - Mark Rothko (painter, 1903-1970).
Abstract:
This chapter sets out to identify patterns at play in boardroom discussions around the design and adoption of an accountability system in a nonprofit organisation. To this end, it contributes to the scarce literature showing the backstage of management accounting systems (Berry, 2005), investment policy determination (Kreander, Beattie & McPhail, 2009; Kreander, McPhail & Molyneaux, 2004) and financial planning strategizing (Parker, 2004) or budgeting (Irvine, 2005). The paucity of publications is due to confidentiality issues preventing attendance at those meetings (Irvine, 2003; Irvine & Gaffikin, 2006). Moreover, the implementation of a new control technology often occurs over a long period of time that might exceed the duration of a research project (Quattrone & Hopper, 2001, 2005). Recent trends towards having research funded by grants from private institutions or charities have tended to reduce the length of such undertakings to a few months or, rarely, more than a couple of years (Parker, 2013).
Abstract:
A long-period magnetotelluric (MT) survey, with 39 sites covering an area of 270 by 150 km, has identified melt within the thinned lithosphere of the Pleistocene-Holocene Newer Volcanics Province (NVP) in southeast Australia, which has been variously attributed to mantle plume activity or edge-driven mantle convection. Two-dimensional inversions from the MT array imaged a low-resistivity anomaly (10-30 Ωm) beneath the NVP at ∼40-80 km depth, which is consistent with the presence of ∼1.5-4% partial melt in the lithosphere, but inconsistent with elevated iron content, metasomatism products or a hot spot. The conductive zone is located within thin juvenile oceanic mantle lithosphere, which was accreted onto thicker Proterozoic continental mantle lithosphere. We propose that the NVP owes its origin to decompression melting within the asthenosphere, promoted by lithospheric thickness variations in conjunction with rapid shear, where asthenospheric material is drawn by shear flow at a "step" at the base of the lithosphere.
Abstract:
Research over a long period of time has continued to demonstrate problems in the teaching of science in schools. In addition, declining levels of participation and interest in science and related fields have been reported from many countries, particularly Western ones. Among the strategies suggested is the recruitment of professional scientists and technologists, either at the graduate level or at an advanced career stage, to change career and teach. In this study, we analysed how one beginning middle-primary teacher engaged with students to support their science learning by establishing rich classroom discussions. We followed his evolving teaching expertise over three years, focussing on his communicative practices informed by socio-cultural theory. His practices exemplified a non-interactive dialogical communicative approach where ideas were readily discussed but were concentrated on the class acquiring acceptable scientific understandings. His focus on the language of science was a significant aspect of his practice and one that emerged from his professional background. The study affirms the theoretical frameworks proposed by Mortimer and Scott (2003), highlighting how dialogue contributes to heightened student interest in science.
Abstract:
This paper presents the site classification of the Bangalore Mahanagar Palike (BMP) area using geophysical data and the evaluation of spectral acceleration at ground level using a probabilistic approach. Site classification has been carried out using experimental data from the shallow geophysical method of Multichannel Analysis of Surface Waves (MASW). One-dimensional (1-D) MASW surveys have been carried out at 58 locations and the respective velocity profiles obtained. The average shear wave velocity to 30 m depth (Vs(30)) has been calculated and used for the site classification of the BMP area as per NEHRP (National Earthquake Hazards Reduction Program). Based on the Vs(30) values, the major part of the BMP area can be classified as "site class D" and "site class C". A smaller portion of the study area, in and around Lalbagh Park, is classified as "site class B". Further, probabilistic seismic hazard analysis has been carried out to map the seismic hazard in terms of spectral acceleration (S-a) at rock and ground level, considering the site classes and the six seismogenic sources identified. The mean annual rate of exceedance and cumulative probability hazard curves for S-a have been generated. The quantified hazard values in terms of spectral acceleration for short and long periods are mapped for rock, site class C and site class D with 10% probability of exceedance in 50 years on a grid size of 0.5 km. In addition, the Uniform Hazard Response Spectrum (UHRS) at surface level has been developed for 5% damping and 10% probability of exceedance in 50 years for rock, site class C and site class D. These spectral accelerations and uniform hazard spectra can be used to assess the design force for important structures and also to develop the design spectrum.
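The Vs(30) value underlying the site classification is the travel-time-weighted average shear-wave velocity over the top 30 m, Vs30 = 30 / Σ(d_i / v_i) over the layers in the profile. A minimal sketch of this computation and the standard NEHRP class boundaries is given below; the layer data are illustrative, not from the BMP survey.

```python
# Sketch of Vs(30) from a layered shear-wave velocity profile, and the
# standard NEHRP site class boundaries: Vs30 = 30 / sum(d_i / v_i).

def vs30(layers):
    """layers: list of (thickness_m, shear_velocity_m_per_s), top-down.
    Layers are clipped so only the top 30 m contribute."""
    depth = travel_time = 0.0
    for d, v in layers:
        d = min(d, 30.0 - depth)      # clip the last layer at 30 m depth
        travel_time += d / v
        depth += d
        if depth >= 30.0:
            break
    return 30.0 / travel_time

def nehrp_class(v):
    """Standard NEHRP boundaries (m/s) for site classes A-E."""
    if v > 1500:
        return 'A'
    if v > 760:
        return 'B'
    if v > 360:
        return 'C'
    if v > 180:
        return 'D'
    return 'E'
```

Because the average is travel-time weighted, a soft surface layer lowers Vs(30) far more than its share of the 30 m would suggest, which is why shallow MASW profiling is sufficient for this classification.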