103 results for Set of Weak Stationary Dynamic Actions
Abstract:
This paper presents a new approach to improving the effectiveness of autonomous systems that deal with dynamic environments. The basis of the approach is to find repeating patterns of behavior in the dynamic elements of the system, and then to use predictions of those repeating elements to better plan goal-directed behavior. It is a layered approach involving classifying, modeling, predicting and exploiting. Classifying involves using observations to place the moving elements into previously defined classes. Modeling involves recording features of the behavior on a coarse-grained grid. Exploitation is achieved by integrating predictions from the model into the behavior selection module to improve the utility of the robot's actions. This is in contrast to typical approaches that use the model to select between different strategies or plays. Three methods of adaptation to the dynamic features of the environment are explored. The effectiveness of each method is determined using statistical tests over a number of repeated experiments. The work is presented in the context of predicting opponent behavior in the highly dynamic, multi-agent robot soccer domain (RoboCup).
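As a rough illustration of the classify-model-exploit layering described above, the hedged sketch below bins observations of a moving element onto a coarse-grained grid, keeps per-class visit counts as the model, and exposes a normalised occupancy map as the prediction. The grid size, behaviour classes and speed threshold are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

GRID_SHAPE = (10, 10)  # coarse-grained grid over the field (assumed resolution)

class OpponentModel:
    def __init__(self):
        # one occupancy-count grid per previously defined behaviour class
        self.counts = {}

    def classify(self, observation):
        # Placeholder classifier: map an observed speed to a coarse behaviour class.
        return "fast" if observation["speed"] > 1.0 else "slow"

    def update(self, observation):
        # Modeling step: record the observed grid cell for the element's class.
        cls = self.classify(observation)
        grid = self.counts.setdefault(cls, np.ones(GRID_SHAPE))  # Laplace smoothing
        x, y = observation["cell"]
        grid[x, y] += 1

    def predict(self, cls):
        # Prediction step: probability map of likely cells for this class.
        grid = self.counts.get(cls, np.ones(GRID_SHAPE))
        return grid / grid.sum()

# Exploitation would weight candidate actions by the predicted occupancy of their
# target cells inside the behavior selection module.
model = OpponentModel()
model.update({"speed": 1.5, "cell": (3, 4)})
print(model.predict("fast")[3, 4])
```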
Abstract:
Service bundling can be regarded as an option for service providers to strengthen their competitive advantage, cope with dynamic market conditions and meet heterogeneous consumer demand. Despite these positive effects, there is little concrete guidance for identifying service bundles or for the act of bundling itself. To fill this gap, previous research has conceptualized a service bundling method that relies on a structured service description. This method supports reasoning about the suitability of services to be part of a bundle by analyzing existing relationships between services captured by a description language. This paper extends the aforementioned research by presenting an initial set of empirically derived relationships between services in existing bundles that can subsequently be utilized to identify potential new bundles. Additionally, a gap analysis points out to what extent prominent ontologies and service description languages accommodate the identified relationships.
Abstract:
To date, most applications of algebraic analysis and attacks on stream ciphers are on those based on linear feedback shift registers (LFSRs). In this paper, we extend algebraic analysis to non-LFSR based stream ciphers. Specifically, we perform an algebraic analysis on the RC4 family of stream ciphers, an example of stream ciphers based on dynamic tables, and investigate its implications to potential algebraic attacks on the cipher. This is, to our knowledge, the first paper that evaluates the security of RC4 against algebraic attacks through providing a full set of equations that describe the complex word manipulations in the system. For an arbitrary word size, we derive algebraic representations for the three main operations used in RC4, namely state extraction, word addition and state permutation. Equations relating the internal states and keystream of RC4 are then obtained from each component of the cipher based on these algebraic representations, and analysed in terms of their contributions to the security of RC4 against algebraic attacks. Interestingly, it is shown that each of the three main operations contained in the components has its own unique algebraic properties, and when their respective equations are combined, the resulting system becomes infeasible to solve. This results in a high level of security being achieved by RC4 against algebraic attacks. On the other hand, the removal of an operation from the cipher could compromise this security. Experiments on reduced versions of RC4 have been performed, which confirms the validity of our algebraic analysis and the conclusion that the full RC4 stream cipher seems to be immune to algebraic attacks at present.
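For readers unfamiliar with the cipher, the sketch below is a plain Python rendering of standard RC4 (KSA and PRGA), with comments marking the three operations analysed in the paper: word addition, state permutation and state extraction. The word-size parameter and test key are assumptions for illustration; the paper's algebraic equations themselves are not reproduced here.

```python
def rc4_keystream(key, n, word_bits=8):
    """Generate n keystream words of RC4 over words of `word_bits` bits."""
    size = 1 << word_bits
    # Key-scheduling algorithm (KSA)
    S = list(range(size))
    j = 0
    for i in range(size):
        j = (j + S[i] + key[i % len(key)]) % size   # word addition
        S[i], S[j] = S[j], S[i]                     # state permutation
    # Pseudo-random generation algorithm (PRGA)
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % size
        j = (j + S[i]) % size                       # word addition
        S[i], S[j] = S[j], S[i]                     # state permutation
        out.append(S[(S[i] + S[j]) % size])         # state extraction
    return out

print(rc4_keystream([1, 2, 3, 4, 5], 8))  # illustrative 5-word key
```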
Abstract:
Experts in injection molding often refer to previous solutions to find a mold design similar to the current mold and use previous successful molding process parameters, with intuitive adjustment and modification, as a start for the new molding application. This approach saves a substantial amount of time and cost in the experiment-based corrective actions that are required to reach optimum molding conditions. A Case-Based Reasoning (CBR) system can perform the same task by retrieving a similar case from the case library, applying it to the new case and using modification rules to adapt the solution to the new case. Therefore, a CBR system can simulate human expertise in injection molding process design. This research is aimed at developing an interactive Hybrid Expert System to reduce the expert dependency needed on the production floor. The Hybrid Expert System (HES) is comprised of CBR, flow analysis, post-processor and troubleshooting systems. The HES can provide the first set of operating parameters needed to achieve moldability conditions and produce moldings free of stress cracks and warpage. In this work the C++ programming language is used to implement the expert system. The Case-Based Reasoning sub-system is constructed to derive the optimum magnitude of the process parameters in the cavity. Toward this end the Flow Analysis sub-system is employed to calculate the pressure drop and temperature difference in the feed system to determine the required magnitude of the parameters at the nozzle. The Post-Processor is implemented to convert the molding parameters to machine setting parameters. The parameters designed by the HES are implemented using the injection molding machine. In the presence of any molding defect, a troubleshooting sub-system can determine which combination of process parameters must be changed during the process to deal with possible variations. Constraints in relation to the application of this HES are as follows: flow length (L): 40 mm < L < 100 mm; flow thickness (Th): 1 mm < Th < 4 mm; flow type: unidirectional flow; material types: High Impact Polystyrene (HIPS) and Acrylic. In order to test the HES, experiments were conducted and satisfactory results were obtained.
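The case retrieval step described above can be pictured with a short, hedged sketch of similarity-based retrieval from a case library. The attributes, weights and the two stored cases are purely illustrative assumptions, not the thesis' actual case representation.

```python
# Hypothetical case library: each case stores problem features and a known solution.
CASE_LIBRARY = [
    {"flow_length": 60, "thickness": 2.0, "material": "HIPS",
     "solution": {"melt_temp_C": 220, "inject_press_MPa": 80}},
    {"flow_length": 90, "thickness": 3.5, "material": "Acrylic",
     "solution": {"melt_temp_C": 240, "inject_press_MPa": 95}},
]

WEIGHTS = {"flow_length": 1.0, "thickness": 2.0, "material": 3.0}  # assumed weights

def similarity(case, query):
    # Weighted distance over numeric attributes plus a mismatch penalty for material.
    d = WEIGHTS["flow_length"] * abs(case["flow_length"] - query["flow_length"]) / 60.0
    d += WEIGHTS["thickness"] * abs(case["thickness"] - query["thickness"]) / 3.0
    d += WEIGHTS["material"] * (case["material"] != query["material"])
    return 1.0 / (1.0 + d)

def retrieve(query):
    # Retrieve the most similar stored case; adaptation rules would then modify it.
    return max(CASE_LIBRARY, key=lambda c: similarity(c, query))

best = retrieve({"flow_length": 70, "thickness": 2.5, "material": "HIPS"})
print(best["solution"])
```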
Abstract:
This dissertation develops the model of a prototype system for the digital lodgement of spatial data sets with statutory bodies responsible for the registration and approval of land related actions under the Torrens Title system. Spatial data pertain to the location of geographical entities together with their spatial dimensions and are classified as point, line, area or surface. This dissertation deals with a sub-set of spatial data: land boundary data that result from the activities performed by surveying and mapping organisations for the development of land parcels. The prototype system has been developed, utilising an event-driven paradigm for the user-interface, to exploit the potential of digital spatial data being generated from the utilisation of electronic techniques. The system provides for the creation of a digital model of the cadastral network and dependent data sets for an area of interest from hard copy records. This initial model is calibrated on registered control and updated by field survey to produce an amended model. The field-calibrated model is then electronically validated to ensure it complies with standards of format and content. The prototype system was designed specifically to create a database of land boundary data for subsequent retrieval by land professionals for surveying, mapping and related activities. Data extracted from this database are utilised for subsequent field survey operations without the need to create an initial digital model of an area of interest. Statistical reporting of the differences found when subsequent initial and calibrated models are compared replaces the traditional checking operations of spatial data performed by a land registry office. Digital lodgement of survey data is fundamental to the creation of the database of accurate land boundary data. This creation of the database is fundamental also to the efficient integration of accurate spatial data about land being generated by modern technology, such as global positioning systems, remote sensing and imaging, with land boundary information and other information held in Government databases. The prototype system developed provides for the delivery of accurate, digital land boundary data for the land registration process to ensure the continued maintenance of the integrity of the cadastre. Such data should also meet the more general and encompassing requirements of, and prove to be of tangible, longer term benefit to, the developing electronic land information industry.
Abstract:
The paper discusses the operating principles and control characteristics of a dynamic voltage restorer (DVR). It is assumed that the source voltages contain interharmonic components in addition to the fundamental components. The main aim of the DVR is to produce a set of clean, balanced sinusoidal voltages across the load terminals irrespective of unbalance, distortion and voltage sag/swell in the supply voltage. An algorithm is discussed for extracting the fundamental phasor sequence components from samples of three-phase voltage or current waveforms containing integer harmonics and interharmonics. The DVR operation based on the extracted components is demonstrated. The switching signal is generated using a deadbeat controller. It is shown that the DVR is able to compensate for these interharmonic components such that the load voltages are perfectly regulated. DVR operation under deep voltage sag is also discussed. The proposed DVR operation is verified through computer simulation studies using the MATLAB software package.
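The extraction step can be pictured with a minimal sketch that is not the paper's algorithm: a one-cycle DFT correlation recovers the fundamental phasor of a single sampled waveform, whereas the paper's method additionally handles sequence components of three-phase sets and interharmonics. The sampling rate, fundamental frequency and test waveform below are assumptions.

```python
import numpy as np

FS = 6400.0        # samples per second (assumed)
F1 = 50.0          # fundamental frequency in Hz (assumed)
N = int(FS / F1)   # samples per fundamental cycle

t = np.arange(N) / FS
# Test waveform: 230 V RMS fundamental plus an interharmonic disturbance at 183 Hz.
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * F1 * t + 0.1) \
    + 20 * np.sin(2 * np.pi * 183.0 * t)

# One-cycle DFT at the fundamental: correlate with sin/cos over exactly one period.
cos_corr = (2.0 / N) * np.sum(v * np.cos(2 * np.pi * F1 * t))
sin_corr = (2.0 / N) * np.sum(v * np.sin(2 * np.pi * F1 * t))
amplitude = np.hypot(cos_corr, sin_corr)
phase = np.arctan2(cos_corr, sin_corr)   # phase relative to the sine reference

print(f"fundamental amplitude ~ {amplitude:.1f} V, phase ~ {phase:.3f} rad")
```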
Abstract:
The role of cardiopulmonary signals in the dynamics of wavefront aberrations in the eye has been examined. Synchronous measurements of the eye's wavefront aberrations, cardiac function, blood pulse and respiration signals were taken for a group of young, healthy subjects. Two focusing stimuli, three breathing patterns, as well as natural and cycloplegic eye conditions were examined. A set of tools, including time–frequency coherence and its metrics, has been proposed to acquire a detailed picture of the interactions of the cardiopulmonary system with the eye's wavefront aberrations. The results showed that the coherence of the blood pulse and its harmonics with the eye's aberrations was, on average, weak (0.4 ± 0.15), while the coherence of the respiration signal with the eye's aberrations was, on average, moderate (0.53 ± 0.14). It was also revealed that there were significant intervals during which high coherence occurred. On average, the coherence was high (>0.75) during 16% of the recorded time for the blood pulse, and 34% of the time for the respiration signal. A statistically significant decrease in average coherence was noted for the eye's aberrations with respiration in the case of fast controlled breathing (0.5 Hz). The coherence between the blood pulse and the defocus was significantly larger for the far target than for the near target condition. After cycloplegia, the coherence of defocus with the blood pulse significantly decreased, while this was not the case for the other aberrations. There was also a noticeable, but not statistically significant, increase in the coherence of the comatic term and respiration in that case. By using nonstationary measures of signal coherence, a more detailed picture of the interactions between the cardiopulmonary signals and the eye's wavefront aberrations has emerged.
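As a rough illustration of the kind of coherence analysis described above, the sketch below computes an ordinary magnitude-squared coherence between a simulated respiration trace and a weakly coupled, noisy defocus trace; the paper itself uses time–frequency coherence, and the sampling rate, frequencies and noise level here are assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs = 25.0                               # Hz, assumed aberrometer sampling rate
t = np.arange(0, 60, 1 / fs)            # 60 s recording
respiration = np.sin(2 * np.pi * 0.25 * t)                 # ~0.25 Hz breathing
defocus = 0.05 * np.sin(2 * np.pi * 0.25 * t + 0.6) \
          + 0.02 * np.random.randn(t.size)                 # weakly coupled + noise

# Welch-style magnitude-squared coherence between the two signals.
f, Cxy = coherence(respiration, defocus, fs=fs, nperseg=256)
peak = np.argmax(Cxy[1:]) + 1           # skip the DC bin
print(f"peak coherence {Cxy[peak]:.2f} at {f[peak]:.2f} Hz")
```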
Abstract:
Bronfenbrenner's Bioecological Model, expressed as the developmental equation D = f(PPCT), is the theoretical framework for two studies that bring together diverse strands of psychology to study the work-life interface of working adults. Occupational and organizational psychology is focused on the demands and resources of work and family, without emphasising the individual in detail. Health and personality psychology examine the individual but without emphasis on the individual's work and family roles. The current research used Bronfenbrenner's theoretical framework to combine individual differences, work and family to understand how these factors influence the working adult's psychological functioning. Competent development has been defined as high well-being (measured as life satisfaction and psychological well-being) and high work engagement (as work vigour, work dedication and absorption in work), and as the absence of mental illness (as depression, anxiety and stress) and the absence of burnout (as emotional exhaustion, cynicism and professional efficacy). Studies 1 and 2 were linked, with Study 1 a cross-sectional survey and Study 2 a prospective panel study that followed on from the data used in Study 1. Participants were recruited from a university and from a large public hospital to take part in a 3-wave, online study where they completed identical surveys at 3-4 month intervals (N = 470 at Time 1 and N = 198 at Time 3). In Study 1, hierarchical multiple regressions were used to assess the effects of individual differences (Block 1, e.g. dispositional optimism, coping self-efficacy, perceived control of time, humour), work and family variables (Block 2, e.g. affective commitment, skill discretion, work hours, children, marital status, family demands) and the work-life interface (Block 3, e.g. direction and quality of spillover between roles, work-life balance) on the outcomes. There was a mosaic of predictors of the outcomes, with a group of seven that were the most frequent significant predictors and which represented the individual (dispositional optimism and coping self-efficacy), the workplace (skill discretion, affective commitment and job autonomy) and the work-life interface (negative work-to-family spillover and negative family-to-work spillover). Interestingly, gender and working hours were not important predictors. The effects of job social support, generally and for work-life issues, perceived control of time and egalitarian gender roles on the outcomes were mediated by negative work-to-family spillover, particularly for emotional exhaustion. Further, the effect of negative spillover on depression, anxiety and work engagement was moderated by the individual's personal and workplace resources. Study 2 modelled the longitudinal relationships between the group of the seven most frequent predictors and the outcomes. Using a set of non-nested models, the relative influences of concurrent functioning, stability and change over time were assessed. The modelling began with models at Time 1, which formed the basis for confirmatory factor analysis (CFA) to establish the underlying relationships between the variables and calculate the composite variables for the longitudinal models. The CFAs were well fitting with few modifications needed to ensure good fit. However, using burnout and work engagement together required additional analyses to resolve poor fit, with one factor (representing a continuum from burnout to work engagement) being the only acceptable solution.
Five different longitudinal models were investigated as the Well-Being, Mental Distress, Well-Being-Mental Health, Work Engagement and Integrated models, using differing combinations of the outcomes. The best fitting model for each was a reciprocal model that was trimmed of trivial paths. The strongest paths were the synchronous correlations and the paths within variables over time. The reciprocal paths were more variable, with weak to mild effects. There was evidence of gain and loss spirals between the variables over time, with a slight net gain in resources that may provide the mechanism for the accumulation of psychological advantage over a lifetime. The longitudinal models also showed that there are leverage points at which personal, psychological and managerial interventions can be targeted to bolster the individual and provide supportive workplace conditions that also minimise negative spillover. Bronfenbrenner's developmental equation has been a useful framework for the current research, showing the importance of the person as central to the individual's experience of the work-life interface. By taking control of their own life, the individual can craft a life path that is most suited to their own needs. Competent developmental outcomes were most likely where the person was optimistic and had high self-efficacy, worked in a job that they were attached to and which allowed them to use their talents, and did not experience too much negative spillover between their work and family domains. In this way, individuals had greater well-being, better mental health and greater work engagement at any one time and across time.
Abstract:
A set of non-nested longitudinal models tested the relationships between personal and workplace resources, well-being and work engagement. The reciprocal model, trimmed of trivial paths, had the best fit and parsimony. The model showed the strong influences of concurrent functioning, the stability of variables over time and weaker reciprocal relationships between variables across time. Individuals with greater confidence in themselves and the future experience better work conditions and have greater well-being and work engagement. These day-to-day influences are equalled by the long-term strength and stability of Individual Factors, Positive Workplace Factors, and Overall Well-Being. Whilst the reciprocal paths had only weak to mild effects, there was mutual reinforcement of Individual Factors and Overall Well-Being, with Positive Workplace Factors and Work Engagement counterbalancing each other, indicating a more complex relationship. Well-being, particularly, is anchored in the immediate and distant past and provides a robust stability to functioning into the future.
Abstract:
Delegation, from a technical point of view, is widely considered a potential approach to providing dynamic access control decisions in activities with a high level of collaboration, either within a single security domain or across multiple security domains. Although delegation continues to attract significant attention from the research community, there is at present no published work that presents a taxonomy of delegation concepts and models. This paper intends to address this gap by presenting a set of taxonomic criteria relevant to the concept of delegation and applying the taxonomy to a selection of significant delegation models published in the literature.
Abstract:
Delegation, from a technical point of view, is widely considered a potential approach to providing dynamic access control decisions in activities with a high level of collaboration, either within a single security domain or across multiple security domains. Although delegation continues to attract significant attention from the research community, there is at present no published work that presents a taxonomy of delegation concepts and models. This article intends to address this gap by presenting a set of taxonomic criteria relevant to the concept of delegation. This article also applies the taxonomy to a selection of significant delegation models published in the literature.
Abstract:
This article examines the moment of exchange between artist, audience and culture in Live Art. Drawing on historical and contemporary examples, including examples from the Exist in 08 Live Art Event in Brisbane, Australia, in October 2008, it argues that Live Art - be it body art, activist art, site-specific performance, or other sorts of performative intervention in the public sphere - is characterised by a common set of claims about activating audiences, asking them to reflect on cultural norms challenged in the work. Live Art presents risky actions, in a context that blurs the boundaries between art and reality, to position audients as ‘witnesses’ who are personally implicated in, and responsible for, the actions unfolding before them. This article problematises assumptions about the way the uncertainties embedded in the Live Art encounter contribute to its deconstructive agenda. It uses the ethical theory of Emmanuel Levinas, Hans-Thies Lehmann and Dwight Conquergood to examine the mechanics of reductive, culturally-recuperative readings that can limit the efficacy of the Live Art encounter. It argues that, though ‘witnessing’ in Live Art depends on a relation to the real - real people, taking real risks, in real places - if it fails to foreground the theatrical frame, it is difficult for audients to develop the dual consciousness of the content, and of their complicity in that content, that is the starting point for reflexivity, and response-ability, in the ethical encounter.
Abstract:
The emergence of ePortfolios is relatively recent in the university sector as a way to engage students in their learning and assessment, and to produce records of their accomplishments. An ePortfolio is an online tool that students can utilise to record, catalogue, retrieve and present reflections and artefacts that support and demonstrate the development of graduate students' capabilities and professional standards across university courses. The ePortfolio is therefore considered as both process and product. Although ePortfolios show promise as a useful tool and their uptake has grown, they are not yet a mainstream higher education technology. To date, the emphasis has been on investigating their potential to support the multiple purposes of learning, assessment and employability, but less is known about whether and how students engage with ePortfolios in the university setting. This thesis investigates student engagement with an ePortfolio in one university. As the educational designer for the ePortfolio project at the University, I was uniquely positioned as a researching professional to undertake an inquiry into whether students were engaging with the ePortfolio. The participants in this study were a cohort (defined by enrolment in a unit of study) of second and third year education students (n=105) enrolled in a four year Bachelor of Education degree. The students were introduced to the ePortfolio in an introductory lecture and a hands-on workshop in a computer laboratory. They were subsequently required to complete a compulsory assessment task - a critical reflection - using the ePortfolio. Following that, engagement with the ePortfolio was voluntary. A single case study approach arising from an interpretivist paradigm directed the methodological approach and research design for this study. The study investigated the participants' own accounts of their experiences with the ePortfolio, including how and when they engaged with the ePortfolio and the factors that impacted on their engagement. Data collection methods consisted of an attitude survey, student interviews, document collection, a researcher reflective journal and researcher observations. The findings of the study show that, while the students were encouraged to use the ePortfolio as a learning and employability tool, most students ultimately chose to disengage after completing the assessment task. Only six of the forty-five students (13%) who completed the research survey had used the ePortfolio in a sustained manner. The data obtained from the students during this research has provided insight into reasons why they disengaged from the ePortfolio. The findings add to the understandings and descriptions of student engagement with technology and, more broadly, advance the understanding of ePortfolios. These findings also contribute to the interdisciplinary field of technology implementation. There are three key outcomes from this study: a model of student engagement with technology, a set of criteria for the design of an ePortfolio, and a set of recommendations for effective practice for those implementing ePortfolios. The first, the Model of Student Engagement with Technology (MSET) (Version 2), explores student engagement with technology by highlighting key engagement decision points for students. The model was initially conceptualised by building on the work of previous research (Version 1); however, following data analysis a new model emerged, MSET (Version 2).
The engagement decision points were identified as:
• Prior Knowledge and Experience, leading to imagined usefulness and imagined ease of use;
• Initial Supported Engagement, leading to supported experience of usefulness and supported ease of use;
• Initial Independent Engagement, leading to actual experience of independent usefulness and actual ease of use; and
• Ongoing Independent Engagement, leading to ongoing experience of usefulness and ongoing ease of use.
The Model of Student Engagement with Technology (MSET) goes beyond numerical figures of usage to demonstrate student engagement with an ePortfolio. The explanatory power of the model is based on the identification of the types of decisions that students make and when they make them during the engagement process. This model presents a greater depth of understanding of student engagement than was previously available and has implications for the direction and timing of future implementation, and for academic and student development activities. The second key outcome from this study is a set of criteria for the re-conceptualisation of the University ePortfolio. The knowledge gained from this research has resulted in a new set of design criteria that focus on the student actions of writing reflections and adding artefacts. The process of using the ePortfolio is reconceptualised in terms of privileging student learning over administrative compliance. The focus of the ePortfolio is that the writing of critical reflections is the key function, not the selection of capabilities. The third key outcome from this research consists of five recommendations for university practice that have arisen from this study. They are: that sustainable implementation is more often achieved through small steps building on one another; that a clear definition of the purpose of an ePortfolio is crucial for students and staff; that ePortfolio pedagogy should be the driving force, not the technology; that the merit of the ePortfolio must be fostered in students and staff; and finally, that supporting delayed task performance is crucial. Students do not adopt an ePortfolio just because it is provided. While students must accept responsibility for their own engagement with the ePortfolio, the institution has to accept responsibility for providing the environment, and the technical and pedagogical support, to foster engagement. Ultimately, an ePortfolio should be considered as a joint venture between student and institution where strong returns on investment can be realised by both. It is acknowledged that the current implementation strategies for the ePortfolio are just the beginning of a much longer process. The real rewards for students, academics and the university lie in the future.
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence no “gold standard” test is currently available to assess the tear film integrity. Therefore, improving techniques for the assessment of the tear film quality is of clinical significance and the main motivation for the work described in this thesis. In this study the tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected on the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to the scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines has been purposely developed to quantify the changes of the reflected pattern and to extract a time series estimate of the TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis. In this area a metric of the TFSQ is calculated. Initially, two metrics based on the Gabor filter and Gaussian gradient-based techniques were used to quantify the consistency of the pattern's local orientation as a metric of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle, but systematic, degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently a larger clinical study was conducted to perform a comparison between HSV and two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation,
while the LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. The receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural blinking conditions and suppressed blinking conditions, closely followed by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during the former clinical study, was a lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-ring pattern into an image of quasi-straight lines from which a block statistics value is extracted. This metric has shown better sensitivity under low pattern disturbance and has improved the performance of the ROC curves. Additionally, a theoretical study, based on ray-tracing techniques and topographical models of the tear film, was undertaken to fully comprehend the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of the tear film dynamics; for instance, the model extracted for the build-up phase has helped to provide some insight into the dynamics during this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series have been reported in this thesis. Over the years, different functions have been used to model the time series as well as to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of the tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool to assess tear film surface quality in the future.
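The Cartesian-to-polar transformation and block-statistics idea described above can be sketched roughly as follows. The synthetic ring pattern, ring centre, sampling grid and block size are assumptions made for illustration; the thesis' actual pre-processing and metric definition are not reproduced.

```python
import numpy as np

def to_polar(image, centre, n_r=128, n_theta=256):
    """Resample an image onto a (radius, angle) grid so that concentric rings map
    to quasi-straight horizontal lines."""
    h, w = image.shape
    r = np.linspace(1, min(h, w) // 2 - 1, n_r)
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    x = (centre[1] + rr * np.cos(tt)).astype(int).clip(0, w - 1)
    y = (centre[0] + rr * np.sin(tt)).astype(int).clip(0, h - 1)
    return image[y, x]

def block_metric(polar_img, block=(8, 8)):
    """Mean of per-block standard deviations: higher values indicate a more
    disturbed (less regular) ring pattern."""
    br, bc = block
    h, w = polar_img.shape
    h, w = h - h % br, w - w % bc
    blocks = polar_img[:h, :w].reshape(h // br, br, w // bc, bc)
    return blocks.std(axis=(1, 3)).mean()

# Synthetic Placido-like pattern: concentric rings plus mild noise.
yy, xx = np.mgrid[0:256, 0:256]
radius = np.hypot(yy - 128, xx - 128)
rings = 0.5 * (1 + np.sin(radius / 4.0)) + 0.05 * np.random.randn(256, 256)

print(f"TFSQ block metric: {block_metric(to_polar(rings, (128, 128))):.4f}")
```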
Abstract:
We study the regret of optimal strategies for online convex optimization games. Using von Neumann's minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional--the minimizer over the player's actions of expected loss--defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary. Peter L. Bartlett, Alexander Rakhlin
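The quantity described above can be restated, under assumed generic notation (player action set A, adversary moves z_1, ..., z_n, loss \ell, horizon n), roughly as follows; the paper's own symbols and precise conditions may differ.

```latex
% Hedged restatement of the minimax regret characterization described in the abstract.
% Notation (action set A, adversary moves z_t, loss \ell) is assumed, not the paper's own.
\[
  R_n \;=\; \sup_{P}\;
  \mathbb{E}_{z_1,\dots,z_n \sim P}
  \Biggl[\,
     \sum_{t=1}^{n} \inf_{a \in A}
        \mathbb{E}\bigl[\,\ell(a, z_t) \mid z_1,\dots,z_{t-1}\bigr]
     \;-\;
     \inf_{a \in A} \sum_{t=1}^{n} \ell(a, z_t)
  \Biggr],
\]
% where the supremum ranges over joint distributions P of the adversary's action
% sequence: the first term is the sum of minimal conditional expected losses and the
% second is the minimal empirical loss, matching the description in the abstract.
```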