606 results for dynamic methods


Relevance: 20.00%

Publisher:

Abstract:

Railway service is now the major means of transportation in most countries around the world. With increasing population and expanding commercial and industrial activities, a high quality of railway service is highly desirable. Train service usually varies with population activity throughout the day, and train coordination and service regulation are expected to meet daily passenger demand. Dwell-time control at stations and a fixed coasting point in an inter-station run are the current practices for regulating train service in most metro railway systems. However, flexible and efficient train control and operation are not always possible. Coast control is an economical approach to balancing run time and energy consumption in railway operation: if time is not an important issue, particularly at off-peak hours, energy consumption can be minimised by accepting certain compromises on the train schedule. The capability to identify the starting point for coasting according to the current traffic conditions provides the necessary flexibility for train operation. This paper presents an application of genetic algorithms (GA) to search for the appropriate coasting point(s) and investigates possible improvements in the fitness of genes. Single and multiple coasting point control with a simple GA are developed to obtain solutions, and the corresponding train movement is examined. Further, a hierarchical genetic algorithm (HGA) is introduced to identify the number of coasting points required according to the traffic conditions, and Minimum-Allele-Reserve-Keeper (MARK) is adopted as a genetic operator to achieve fitter solutions.
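The single coasting point search can be illustrated with a minimal GA sketch. The run-time and energy models below are invented placeholders (the paper evaluates candidates with a train movement simulator), and the fitness weights are arbitrary; only the GA machinery of selection, arithmetic crossover and mutation is the point.

```python
import random

random.seed(0)

# Toy inter-station model (illustrative assumptions, not the paper's simulator):
# x in [0, 1] is the fraction of the run completed before coasting begins.
# Later coasting shortens the run but costs more traction energy.
def fitness(x, target_time=120.0, w_energy=1.0, w_time=5.0):
    run_time = 140.0 - 30.0 * x          # assumed: later coasting -> shorter run
    energy   = 50.0 + 80.0 * x           # assumed: later coasting -> more energy
    return w_energy * energy + w_time * abs(run_time - target_time)

def ga_coasting_point(pop_size=30, generations=60, mutation=0.1):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                  # lower cost = fitter
        parents = pop[: pop_size // 2]         # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)              # arithmetic crossover
            if random.random() < mutation:     # small Gaussian mutation, clamped
                child = min(1.0, max(0.0, child + random.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = ga_coasting_point()
```

With these toy weights the best trade-off sits where the run exactly meets the target time; the GA converges to that neighbourhood without ever differentiating the cost function.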

Relevance: 20.00%

Publisher:

Abstract:

Traffic conflicts at railway junctions are very common, particularly on congested rail lines. While safe passage through the junction is well maintained by the signalling and interlocking systems, minimising the delays imposed on the trains by assigning the right-of-way sequence sensibly is a bonus to the quality of service. A deterministic method has been adopted to resolve such conflicts, with the objective of minimising the total weighted delay; however, its computational demand remains significant. The applications of different heuristic methods to this problem are reviewed and explored, elaborating their feasibility in various aspects and comparing their relative merits for further study. As most heuristic methods do not guarantee a global optimum, this study focuses on the trade-off between computation time and optimality of the resolution.
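As a minimal sketch of deterministic resolution, the toy model below enumerates every right-of-way sequence for a few conflicting trains and picks the one minimising total weighted delay. The single-track junction model and the train data are invented for illustration; the factorial growth of the sequence space is exactly the computational demand that motivates the heuristics discussed above.

```python
from itertools import permutations

# Toy junction model (illustrative): each train has an arrival time at the
# junction, a crossing (occupation) time, and a priority weight. Trains pass
# one at a time; delay is the weighted wait between arrival and entry.
def total_weighted_delay(order, trains):
    t, cost = 0.0, 0.0
    for i in order:
        arrival, crossing, weight = trains[i]
        start = max(t, arrival)
        cost += weight * (start - arrival)
        t = start + crossing
    return cost

def resolve_conflict(trains):
    # Exhaustive deterministic search: n! sequences for n trains.
    best = min(permutations(range(len(trains))),
               key=lambda order: total_weighted_delay(order, trains))
    return best, total_weighted_delay(best, trains)

# (arrival, crossing time, weight) -- e.g. train 1 is weighted like an express
trains = [(0.0, 3.0, 1.0), (1.0, 2.0, 5.0), (2.0, 4.0, 1.0)]
order, cost = resolve_conflict(trains)
```

Here the heavily weighted train is given the junction first even though it arrives second; with ten or more trains the enumeration becomes impractical, which is where heuristics trade optimality for speed.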

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE. To measure tear film surface quality in healthy and dry eye subjects using three noninvasive techniques of tear film quality assessment, and to establish the ability of these noninvasive techniques to predict dry eye. METHODS. Thirty-four subjects participated in the study and were classified as dry eye or normal based on standard clinical assessments. Three noninvasive techniques were applied for measurement of tear film surface quality: dynamic-area high-speed videokeratoscopy (HSV), dynamic wavefront sensing (DWS) and lateral shearing interferometry (LSI). The measurements were performed in both natural blinking conditions (NBC) and suppressed blinking conditions (SBC). RESULTS. To investigate the capability of each method to discriminate dry eye subjects from normal subjects, the receiver operating characteristic (ROC) curve was calculated and the area under the curve (AUC) was extracted. The best result was obtained for the LSI technique (AUC = 0.80 in SBC and AUC = 0.73 in NBC), followed by HSV (AUC = 0.72 in SBC and AUC = 0.71 in NBC). The best result for DWS was AUC = 0.64, obtained for changes in vertical coma in suppressed blinking conditions; for natural blinking conditions the results were poorer. CONCLUSIONS. Noninvasive techniques of tear film surface assessment can be used for predicting dry eye, and this can be achieved in natural as well as suppressed blinking conditions. In this study, LSI showed the best detection performance, closely followed by dynamic-area HSV. The wavefront sensing technique was less powerful, particularly in natural blinking conditions.
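The AUC reported above can be computed without tracing the full ROC curve by using its rank-statistic (Mann-Whitney) form: the probability that a randomly chosen dry eye score exceeds a randomly chosen normal score. The scores below are invented for illustration, not the study's data.

```python
# Minimal AUC computation (Mann-Whitney form). Higher score is assumed to
# mean more tear film irregularity, i.e. more likely dry eye.
def auc(scores_pos, scores_neg):
    """Probability a dry-eye (positive) score exceeds a normal (negative) one,
    counting ties as half a win."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

dry_eye = [0.9, 0.8, 0.7, 0.6]   # hypothetical LSI irregularity scores
normal  = [0.5, 0.6, 0.3, 0.2]
print(auc(dry_eye, normal))
```

An AUC of 0.5 would mean the score carries no diagnostic information; the study's best value of 0.80 for LSI sits well above that chance level.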

Relevance: 20.00%

Publisher:

Abstract:

There are several noninvasive techniques for assessing the kinetics of tear film, but no comparative studies have been conducted to evaluate their efficacies. Our aim is to test and compare techniques based on high-speed videokeratoscopy (HSV), dynamic wavefront sensing (DWS), and lateral shearing interferometry (LSI). Algorithms are developed to estimate the tear film build-up time TBLD, and the average tear film surface quality in the stable phase of the interblink interval TFSQAv. Moderate but significant correlations are found between TBLD measured with LSI and DWS based on vertical coma (Pearson's r2=0.34, p<0.01) and higher order rms (r2=0.31, p<0.01), as well as between TFSQAv measured with LSI and HSV (r2=0.35, p<0.01), and between LSI and DWS based on the rms fit error (r2=0.40, p<0.01). No significant correlation is found between HSV and DWS. All three techniques estimate tear film build-up time to be below 2.5 sec, and they achieve a remarkably close median value of 0.7 sec. HSV appears to be the most precise method for measuring tear film surface quality. LSI appears to be the most sensitive method for analyzing tear film build-up.
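The correlations reported above are squared Pearson coefficients. A minimal sketch of the computation, on invented build-up time values rather than the study's measurements:

```python
import math

# Pearson correlation between two hypothetical series of tear film build-up
# times TBLD (sec) from two instruments; the values are illustrative only.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

tbld_lsi = [0.5, 0.7, 0.9, 1.4, 2.1]   # hypothetical LSI build-up times
tbld_dws = [0.6, 0.8, 1.1, 1.2, 2.4]   # hypothetical DWS build-up times
r = pearson_r(tbld_lsi, tbld_dws)
r_squared = r ** 2
```

The r² values of 0.3-0.4 quoted in the abstract correspond to moderate agreement: each technique explains roughly a third of the variance in the other's estimate.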

Relevance: 20.00%

Publisher:

Abstract:

We advance the proposition that dynamic stochastic general equilibrium (DSGE) models should not be estimated and evaluated only with full-information methods, which require that the complete system of equations be specified properly. Some limited-information analysis, which focuses upon specific equations, is therefore likely to be a useful complement to full-system analysis. Two major problems arise when implementing limited-information methods: the presence of forward-looking expectations in the system, and unobservable non-stationary variables. We present methods for dealing with both of these difficulties and illustrate the interaction between full- and limited-information methods using a well-known model.

Relevance: 20.00%

Publisher:

Abstract:

The rapid growth of mobile telephone use, satellite services, and now the wireless Internet and WLANs is generating tremendous change in telecommunications and networking. As indoor wireless communications become more prevalent, modelling indoor radio wave propagation in populated environments is a topic of significant interest. Wireless MIMO communication exploits phenomena such as multipath propagation to increase data throughput and range, or to reduce bit error rates, rather than attempting to eliminate the effects of multipath propagation as traditional SISO communication systems seek to do. The MIMO approach can yield significant gains for both link and network capacities, with no additional transmitting power or bandwidth consumption compared to conventional single-array diversity methods. When MIMO and OFDM systems are combined and deployed in a suitably rich scattering environment, such as indoors, a significant capacity gain can be observed due to the assurance of multipath propagation. Channel variations can occur as a result of the movement of personnel, industrial machinery, vehicles and other equipment within the indoor environment. The time-varying effects on the propagation channel in populated indoor environments depend on the pedestrian traffic conditions and the particular type of environment considered. A systematic measurement campaign to study pedestrian movement effects in indoor MIMO-OFDM channels has not yet been fully undertaken. Measuring channel variations caused by the relative positioning of pedestrians is essential in the study of indoor MIMO-OFDM broadband wireless networks. Theoretically, due to high multipath scattering, an increase in MIMO-OFDM channel capacity is expected when pedestrians are present.
However, measurements indicate that some reduction in channel capacity can be observed as the number of pedestrians approaches 10, owing to diminished multipath conditions as more human bodies absorb the wireless signals. This dissertation presents a systematic characterization of the effects of pedestrians on indoor MIMO-OFDM channels. Measurement results, using the MIMO-OFDM channel sounder developed at the CSIRO ICT Centre, have been validated by a customized Geometric Optics-based ray tracing simulation. Based on measured and simulated MIMO-OFDM channel capacity and capacity dynamic range, an improved deterministic model for MIMO-OFDM channels in indoor populated environments is presented. The model can be used for the design and analysis of future WLANs deployed in indoor environments. The results show that, under both Fixed SNR and Fixed Tx deterministic conditions, the channel capacity dynamic range rose with the number of pedestrians as well as with the number of antenna combinations. In random scenarios with 10 pedestrians, an increase in channel capacity of up to 0.89 bits/sec/Hz in Fixed SNR and up to 1.52 bits/sec/Hz in Fixed Tx was recorded compared to the one-pedestrian scenario. In addition, a maximum increase in average channel capacity of 49% was measured when 4 antenna elements were used, compared with 2 antenna elements. The highest measured average capacity, 11.75 bits/sec/Hz, corresponds to the 4x4 array with 10 pedestrians moving randomly. Moreover, the spread between the highest and lowest values of the dynamic range is larger for Fixed Tx (predicted 5.5 bits/sec/Hz, measured 1.5 bits/sec/Hz) than for the Fixed SNR criterion (predicted 1.5 bits/sec/Hz, measured 0.7 bits/sec/Hz). These findings were confirmed by both measurements and simulations covering 1 to 5, 7 and 10 pedestrians.
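The capacities quoted above (in bits/sec/Hz) follow from the standard MIMO channel capacity expression under equal power allocation, C = log2 det(I + (SNR/Nt) H H^H). A minimal sketch, with a synthetic Rayleigh-fading channel matrix standing in for a measured indoor channel:

```python
import numpy as np

# Shannon capacity of a MIMO channel with equal power per transmit antenna:
#   C = log2 det(I + (SNR / Nt) * H * H^H)   [bits/sec/Hz]
def mimo_capacity(H, snr_linear):
    nr, nt = H.shape
    HH = H @ H.conj().T
    return float(np.real(np.log2(np.linalg.det(np.eye(nr) + (snr_linear / nt) * HH))))

# Hypothetical 4x4 Rayleigh-fading channel (unit average power per entry);
# a real study would substitute the sounder's measured channel matrices.
rng = np.random.default_rng(0)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
snr_db = 20.0
c = mimo_capacity(H, 10 ** (snr_db / 10))
```

A sanity check: for an identity 2x2 channel at a linear SNR of 2, the expression reduces to log2 det(2I) = 2 bits/sec/Hz. Richer scattering (higher-rank H) opens more spatial eigenmodes, which is why capacity rises in the populated indoor scenarios above.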

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this study is to contribute to the cross-disciplinary body of literature on identity and organisational culture. This study empirically investigated the Hatch and Schultz (2002) Organisational Identity Dynamics (OID) model to examine linkages between identity, image, and organisational culture. It used the processes defined in the OID model as a theoretical frame for understanding the relationships between actual and espoused identity manifestations across visual identity (VI), corporate identity (CI), and organisational identity (OI). The linking processes of impressing, mirroring, reflecting, and expressing were discussed at three unique levels in the organisation. The overarching research question, 'How does the organisational identity dynamics process manifest itself in practice at different levels within an organisation?', was used as a means of providing empirical grounding for the previously theoretical OID model. Case study analysis was utilised to provide exploratory data across the organisational groups of: Level A - Senior Marketing and Corporate Communications Management; Level B - Marketing and Corporate Communications Staff; and Level C - Non-Marketing Managers and Employees. Data were collected via 15 in-depth interviews, with documentary analysis used as a supporting mechanism to provide triangulation in the analysis. Data were analysed against the impressing, mirroring, reflecting, and expressing constructs, with specific criteria developed from the literature to provide a detailed analysis of each process. Conclusions revealed marked differences in the ways in which OID processes occurred across different levels, with implications for the ways in which VI, CI, and OI interact to develop holistic identity across organisational levels.
Implications for theory detail the need to understand and utilise cultural understanding in identity programs, as well as the value of developing identity communications which represent an actual rather than an espoused position.

Relevance: 20.00%

Publisher:

Abstract:

Purpose - This paper seeks to examine the complex relationships between urban planning, infrastructure management and sustainable urban development, and to illustrate why there is an urgent need for local governments to develop a robust planning support system which integrates advanced urban computer modelling tools to facilitate better infrastructure management and improve knowledge sharing between the community, urban planners, engineers and decision makers. Design/methodology/approach - The methods used in this paper include literature review and practical project case observations. Originality/value - This paper provides insight into how the planning support system established by Brisbane City Council has significantly improved the effectiveness of urban planning, infrastructure management and community engagement through better knowledge management processes. Practical implications - This paper presents a practical framework for setting up a functional planning support system within local government. The integration of the Brisbane Urban Growth model, Virtual Brisbane and the Brisbane Economic Activity Monitoring (BEAM) database has proven initially successful in providing a dynamic platform to assist elected officials, planners and engineers in understanding the limitations of the local environment, its urban systems and the planning implications for a city. With Brisbane's planning support system, planners and decision makers are able to deliver better planning outcomes, policy and infrastructure that adequately address local needs and achieve sustainable spatial forms.

Relevance: 20.00%

Publisher:

Abstract:

Since its initial proposal in 1998, alkaline hydrothermal processing has rapidly become an established technology for the production of titanate nanostructures. This simple, highly reproducible process has gained a strong research following since its conception. However, complete understanding and elucidation of nanostructure phase and formation have not yet been achieved. Without fully understanding phase, formation, and other important competing effects of the synthesis parameters on the final structure, the maximum potential of these nanostructures cannot be realised. This study therefore examined the influence of synthesis parameters on the formation of titanate nanostructures produced by alkaline hydrothermal treatment. The parameters included alkaline concentration, hydrothermal temperature, the precursor material's crystallite size, and the phase of the titanium dioxide precursor (TiO2, or titania). The nanostructures' phase and morphology were analysed using X-ray diffraction (XRD), Raman spectroscopy and transmission electron microscopy. X-ray photoelectron spectroscopy (XPS), dynamic light scattering (non-invasive backscattering), nitrogen sorption, and Rietveld analysis were used for phase determination, particle sizing, surface area determination, and establishing phase concentrations, respectively. This project rigorously examined the effect of alkaline concentration and hydrothermal temperature on three commercially sourced and two self-prepared TiO2 powders. These precursors consisted of both pure- and mixed-phase anatase and rutile polymorphs, and were selected to cover a range of phase concentrations and crystallite sizes. Typically, these precursors were treated with 5–10 M sodium hydroxide (NaOH) solutions at temperatures between 100 and 220 °C. Both nanotube and nanoribbon morphologies could be produced depending on the combination of these hydrothermal conditions.
Both titania and titanate phases are comprised of TiO6 units assembled in different combinations. The arrangement of these atoms affects the binding energy between the Ti–O bonds. Raman spectroscopy and XPS were therefore employed in a preliminary study of phase determination for these materials. The change from a titania to a titanate binding energy was investigated, and the transformation of titania precursor into nanotubes and titanate nanoribbons was directly observed by these methods. Evaluation of the Raman and XPS results indicated a strengthening in the binding energies of both the Ti (2p3/2) and O (1s) bands, which correlated with an increase in strength and decrease in resolution of the characteristic nanotube doublet observed between 320 and 220 cm⁻¹ in the Raman spectra of these products. The effect of phase and crystallite size on nanotube formation was examined over a series of temperatures (100–200 °C in 20 °C increments) at a set alkaline concentration (7.5 M NaOH). These parameters were investigated by employing both pure- and mixed-phase precursors of anatase and rutile. This study indicated that both crystallite size and phase affect nanotube formation, with rutile requiring a greater driving force (essentially 'harsher' hydrothermal conditions) than anatase to form nanotubes, and larger crystallite forms of the precursor also appeared to impede nanotube formation slightly. These parameters were further examined in later studies. The influence of alkaline concentration and hydrothermal temperature was systematically examined for the transformation of Degussa P25 into nanotubes and nanoribbons, and exact conditions for nanostructure synthesis were determined. Correlation of these data sets resulted in the construction of a morphological phase diagram, which is an effective reference for nanostructure formation.
This morphological phase diagram effectively provides a 'recipe book' for the formation of titanate nanostructures. Morphological phase diagrams were also constructed for larger, near phase-pure anatase and rutile precursors, to further investigate the influence of hydrothermal reaction parameters on the formation of titanate nanotubes and nanoribbons. The effects of alkaline concentration, hydrothermal temperature, and crystallite phase and size become apparent when the three morphological phase diagrams are compared. Through the analysis of these results it was determined that alkaline concentration and hydrothermal temperature affect nanotube and nanoribbon formation independently through a complex relationship, where nanotubes are primarily affected by temperature whilst nanoribbons are strongly influenced by alkaline concentration. Crystallite size and phase also affected nanostructure formation: smaller precursor crystallites formed nanostructures at reduced hydrothermal temperature, and rutile displayed a slower rate of precursor consumption compared to anatase, with incomplete conversion observed for most hydrothermal conditions. The incomplete conversion of rutile into nanotubes was examined in detail in the final study, which selectively examined the kinetics of precursor dissolution in order to understand why rutile converted incompletely. This was achieved by selecting a single hydrothermal condition (9 M NaOH, 160 °C) where nanotubes are known to form from both anatase and rutile, and quenching the synthesis after 2, 4, 8, 16 and 32 hours. The influence of precursor phase on nanostructure formation was explicitly determined to be due to different dissolution kinetics, with anatase exhibiting zero-order dissolution and rutile second-order.
This difference in kinetic order cannot be simply explained by the variation in crystallite size, as the inherent surface areas of the two precursors were determined to have first-order relationships with time. Therefore, the crystallite size (and inherent surface area) does not affect the overall kinetic order of dissolution; rather, it determines the rate of reaction. Finally, nanostructure formation was found to be controlled by the availability of dissolved titanium (Ti4+) species in solution, which is mediated by the dissolution kinetics of the precursor.

Relevance: 20.00%

Publisher:

Abstract:

Seven endemic governance problems are shown to be currently present in governments around the globe and at any level of government as well (for example municipal, federal). These problems have their roots traced back through more than two thousand years of political, specifically ‘democratic’, history. The evidence shows that accountability, transparency, corruption, representation, campaigning methods, constitutionalism and long-term goals were problematic for the ancient Athenians as well as modern international democratisation efforts encompassing every major global region. Why then, given the extended time period humans have had to deal with these problems, are they still present? At least part of the answer to this question is that philosophers, academics and NGOs as well as MNOs have only approached these endemic problems in a piecemeal manner with a skewed perspective on democracy. Their works have also been subject to the ebbs and flows of human history which essentially started and stopped periods of thinking. In order to approach the investigation of endemic problems in relation to democracy (as the overall quest of this thesis was to generate prescriptive results for the improvement of democratic government), it was necessary to delineate what exactly is being written about when using the term ‘democracy’. It is common knowledge that democracy has no one specific definition or practice, even though scholars and philosophers have been attempting to create a definition for generations. What is currently evident, is that scholars are not approaching democracy in an overly simplified manner (that is, it is government for the people, by the people) but, rather, are seeking the commonalities that democracies share, in other words, those items which are common to all things democratic. Following that specific line of investigation, the major practiced and theoretical versions of democracy were thematically analysed. 
After that, their themes were collapsed into larger categories, at which point the larger categories were comparatively analysed with the practiced and theoretical versions of democracy. Four democratic 'particles' (selecting officials, law, equality and communication) were seen to be present in all practiced and theoretical democratic styles. The democratic particles, fused with a unique investigative perspective and in-depth political study, created a solid conceptualisation of democracy. As such, it is argued that democracy is an ever-present element of any state government, 'democratic' or not, and the particles are the bodies which comprise the democratic element. Frequency- and proximity-based analyses showed that democratic particles are related to endemic problems in international democratisation discourse. The linkages between democratic particles and endemic problems were also evident during the thematic analysis as well as the historical review. This ultimately led to the viewpoint that if endemic problems are mitigated, democratic particles may be improved, which might strengthen the element of democracy in the governing apparatus of any state. Such mitigation may actively minimise or wholly displace inefficient forms of government, leading to a government specifically tailored to the population it orders. Once the theoretical and empirical goals were attained, this thesis provided some prescriptive measures which government, civil society, academics, professionals and/or active citizens can use to mitigate endemic problems (in any country and at any level of government) so as to improve the human condition via better democratic government.

Relevance: 20.00%

Publisher:

Abstract:

Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality, where dangerous real-world scenarios can be safely replicated. However, despite the growing popularity of advanced training simulations, methods for evaluating their use rely heavily on subjective measures or analysis of final outcomes. Without dynamic, objective performance measures, the outcome of training, in terms of impact on cognitive skills and the ability to transfer newly acquired skills to the real world, is unknown. The relationship between affective intensity and cognitive learning provides a potential new approach to ensuring that the cognitions which occur prior to final outcomes, such as problem-solving and decision-making, are adequately evaluated. This paper describes the technical aspects of pilot work recently undertaken to develop a new measurement tool designed to objectively track individual affect levels during simulation-based training.

Relevance: 20.00%

Publisher:

Abstract:

For the first time in human history, large volumes of spoken audio are being broadcast, made available on the internet, archived, and monitored for surveillance every day. New technologies are urgently required to unlock these vast and powerful stores of information. Spoken Term Detection (STD) systems provide access to speech collections by detecting individual occurrences of specified search terms. The aim of this work is to develop improved STD solutions based on phonetic indexing. In particular, this work aims to develop phonetic STD systems for applications that require open-vocabulary search, fast indexing and search speeds, and accurate term detection. Within this scope, novel contributions are made within two research themes: firstly, accommodating phone recognition errors and, secondly, modelling uncertainty with probabilistic scores. A state-of-the-art Dynamic Match Lattice Spotting (DMLS) system is used to address the problem of accommodating phone recognition errors with approximate phone sequence matching. Extensive experimentation on the use of DMLS is carried out and a number of novel enhancements are developed that provide for faster indexing, faster search, and improved accuracy. Firstly, a novel comparison of methods for deriving a phone error cost model is presented to improve STD accuracy, resulting in up to a 33% improvement in the Figure of Merit. A method is also presented for drastically increasing the speed of DMLS search by at least an order of magnitude with no loss in search accuracy. An investigation is then presented of the effects of increasing indexing speed for DMLS, by using simpler modelling during phone decoding, with results highlighting the trade-off between indexing speed, search speed and search accuracy. The Figure of Merit is further improved by up to 25% using a novel proposal to utilise word-level language modelling during DMLS indexing.
Analysis shows that this use of language modelling can, however, be unhelpful or even disadvantageous for terms with a very low language model probability. The DMLS approach to STD involves generating an index of phone sequences using phone recognition. An alternative approach to phonetic STD is also investigated that instead indexes probabilistic acoustic scores in the form of a posterior-feature matrix. A state-of-the-art system is described and its use for STD is explored through several experiments on spontaneous conversational telephone speech. A novel technique and framework is proposed for discriminatively training such a system to directly maximise the Figure of Merit. This results in a 13% improvement in the Figure of Merit on held-out data. The framework is also found to be particularly useful for index compression in conjunction with the proposed optimisation technique, providing for a substantial index compression factor in addition to an overall gain in the Figure of Merit. These contributions significantly advance the state-of-the-art in phonetic STD, by improving the utility of such systems in a wide range of applications.
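The approximate phone sequence matching at the heart of DMLS can be sketched as a weighted edit distance between a search term's phone sequence and a decoded sequence. The uniform costs below are placeholders; a real system derives its substitution, insertion and deletion costs from phone recogniser error statistics (the phone error cost model discussed above).

```python
# Weighted edit distance between a search term's phone sequence and a
# recognised phone sequence, computed by dynamic programming.
def match_cost(term, decoded, sub=1.0, ins=1.0, dele=1.0):
    n, m = len(term), len(decoded)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * dele
    for j in range(1, m + 1):
        d[0][j] = j * ins
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(
                d[i - 1][j - 1] + (0.0 if term[i - 1] == decoded[j - 1] else sub),
                d[i - 1][j] + dele,   # phone deleted by the recogniser
                d[i][j - 1] + ins,    # phone inserted by the recogniser
            )
    return d[n][m]

# "dynamic" versus a decoding with a single vowel substitution
cost = match_cost("d ay n ae m ih k".split(), "d ay n eh m ih k".split())
```

A putative occurrence is accepted when this cost falls below a threshold, which is the knob that trades missed detections against false alarms in the Figure of Merit.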

Relevance: 20.00%

Publisher:

Abstract:

This paper seeks to link anthropological and economic treatments of the process of innovation and change, not only within a given ‘complex system’ (e.g. a cosmology; an industry) but also between systems (e.g. cultural and economic systems; but also divine and human systems). The role of the ‘Go-Between’ is considered, both in the anthropological figure of the Trickster (Hyde 1998) and in the Schumpeterian entrepreneur. Both figures parlay appetite (economic wants) into meaning (cultural signs). Both practice a form of creativity based on deception, ‘creative destruction’; renewal by disruption and needs-must adaptation. The disciplinary purpose of the paper is to try to bridge two otherwise disconnected domains – cultural studies and evolutionary economics – by showing that the traditional methods of the humanities (e.g. anthropological, textual and historical analysis) have explanatory force in the context of economic actions and complex-system evolutionary dynamics. The objective is to understand creative innovation as a general cultural attribute rather than one restricted only to accredited experts such as artists; thus to theorise creativity as a form of emergence for dynamic adaptive systems. In this context, change is led by ‘paradigm shifters’ – tricksters and entrepreneurs who create new meanings out of the clash of difference, including the clash of mutually untranslatable communication systems (language, media, culture).

Relevance: 20.00%

Publisher:

Abstract:

Tungro is one of the most destructive viral diseases of rice in South and Southeast Asia. It is associated with two viruses: rice tungro bacilliform virus (RTBV) and rice tungro spherical virus (RTSV) (Hibino et al 1978). Both viruses are transmitted by the green leafhopper (GLH) Nephotettix virescens (Ling 1979). However, prior acquisition of RTSV is required for the transmission of RTBV (Hibino 1983). Plants infected with both viruses show severe stunting and yellowing. Those infected with RTBV alone show mild stunting but no leaf discoloration, whereas those infected with RTSV alone do not show any apparent symptoms (Hibino et al 1978). Since the late 1960s, tungro has been managed mainly through varietal resistance (Khush 1989). The instability of resistant varieties in the field (Dahal et al 1990) led to a re-examination of the nature of the incorporated sources of resistance and to the adoption of more precise and more accurate screening methods.