Abstract:
Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. A system that continuously checks the identity of the user throughout the session, without being intrusive to the end-user, is therefore necessary. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, and most of these techniques are based on biometrics. These continuous biometric authentication systems (CBAS) are driven by user traits and characteristics. One of the main biometric types is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics will be available after the authentication step at the start of the computer session. Currently, research on CBAS with keystroke dynamics is insufficient. To date, most of the existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary keystroke-dynamics CBAS approaches use character sequences as features that are representative of user typing behavior, but their feature selection criteria do not guarantee features with strong statistical significance, which may cause a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behavior. Finally, the existing CBAS that are based on keystroke dynamics are typically dependent on pre-defined user-typing models for continuous authentication. 
This dependency restricts such systems to authenticating only known users whose typing samples have been modelled. This research addresses the previous limitations associated with existing CBAS schemes by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistics-based feature selection techniques that have the highest statistical significance and encompass different user typing behaviors, thereby representing user typing patterns effectively. Finally, the research proposes a user-independent threshold approach that is able to authenticate a user accurately without needing any predefined user typing model a priori. The technique is also enhanced to detect an impostor or intruder who may take over at any point during the computer session.
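Keystroke-dynamics features of the kind discussed above are typically built from inter-key timings over character sequences. As a minimal illustration (the event format, feature choice, and all names here are assumptions for the sketch, not the thesis's actual feature selection techniques), digraph latencies can be extracted as follows:

```python
# Hypothetical sketch: extracting digraph (two-key) latency features from a
# stream of keystroke events -- the kind of character-sequence timing data
# that keystroke-dynamics CBAS schemes model. The event format and feature
# choice are illustrative assumptions only.
from collections import defaultdict

def digraph_latencies(events):
    """events: list of (key, press_time_ms) in chronological order.
    Returns a dict mapping each ordered key pair to its mean
    press-to-press latency in milliseconds."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for (k1, t1), (k2, t2) in zip(events, events[1:]):
        sums[(k1, k2)] += t2 - t1
        counts[(k1, k2)] += 1
    return {pair: sums[pair] / counts[pair] for pair in sums}

sample = [("t", 0.0), ("h", 120.0), ("e", 210.0), ("t", 400.0), ("h", 530.0)]
features = digraph_latencies(sample)
# ("t", "h") occurs twice with latencies 120 ms and 130 ms -> mean 125.0
```

A per-user profile of such latencies is the kind of typing model that the proposed user-independent threshold approach avoids having to predefine.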
Abstract:
Traditionally, Science education has stressed the importance of teaching students to conduct ‘scientific inquiry’, with the main focus being the experimental model of inquiry used by real-world scientists. Current educational approaches using constructivist pedagogy recognise the value of inquiry as a method for promoting the development of deep understanding of discipline content. A recent Information Learning Activity undertaken by a Grade Eight Science class was observed to discover how inquiry-based learning is implemented in contemporary Science education. By analysing student responses to questionnaires and assessment task outcomes, the author was able to determine the level of inquiry inherent in the activity and how well the model supported student learning and the development of students’ information literacy skills. Although students achieved well overall, some recommendations are offered that may enable teachers to better exploit the learning opportunities provided by inquiry-based learning. Planning interventions at key stages of the inquiry process can assist students to learn more effective strategies for dealing with cognitive and affective challenges. Allowing students greater input into the selection of topic or focus of the activity may encourage students to engage more deeply with the learning task. Students are likely to experience greater learning benefit from access to developmentally appropriate resources, increased time to explore topics and multiple opportunities to undertake information searches throughout the learning activity. Finally, increasing the cognitive challenge can enhance both the depth of students’ learning and their information literacy skills.
Abstract:
In various industrial and scientific fields, conceptual models are derived from real-world problem spaces to understand and communicate the entities and coherencies they contain. Abstracted models mirror the common understanding and information demand of engineers, who apply conceptual models in performing their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication processes, as a common understanding is not shared or implemented in specific models. To support direct communication between experts from several disciplines, a visual language is developed which allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.
Abstract:
Problem-based learning (PBL) has been used successfully in disciplines such as medicine, nursing, law and engineering. However, a review of the literature shows that there has been little use of this approach to learning in accounting. This paper extends the research in accounting education by reporting the findings of a case study of the development and implementation of PBL at the Queensland University of Technology (QUT) in a new Accountancy Capstone unit that began in 2006. The fundamentals of the PBL approach were adhered to. However, one of the essential elements of the approach adopted was to highlight the importance of questioning as a means of gathering the necessary information upon which decisions are made. This approach can be contrasted with the typical ‘give all the facts’ case studies that are commonly used. Another feature was that students worked together in the same group for an entire semester (similar to how teams in the workplace operate), so there was an intended focus on teamwork in solving the unstructured, real-world accounting problems presented to students. Based on quantitative and qualitative data collected from student questionnaires over seven semesters, it was found that students perceived PBL to be effective, especially in terms of developing the skills of questioning, teamwork, and problem solving. The effectiveness of questioning is very important as this is a skill that is rarely the focus of development in accounting education. The successful implementation of PBL in accounting through ‘learning by doing’ could be the catalyst for change to bring about better learning outcomes for accounting graduates.
Abstract:
Airports and cities inevitably recognise the value that each brings to the other; however, the separation in decision-making authority over what to build, where, when and how provides a conundrum for both parties. Airports often want a say in what is developed outside of the airport fence, and cities often want a say in what is developed inside the airport fence. Defining how much of a say airports and cities have in decisions beyond their jurisdictional control is likely to remain a live topic so long as airports and cities maintain separate formal decision-making processes for what to build, where, when and how. However, the recent Green and White Papers for a new National Aviation Policy have made early inroads into formalising relationships between Australia’s major airports and their host cities. At present, no clear indication (within practice or literature) exists as to the appropriateness of different governance arrangements for development decisions that bring together the opposing strategic interests of airports and cities; this leaves decisions for infrastructure development as complex decision-making spaces that hold airport and city/regional interests at stake. The line of enquiry is motivated by a lack of empirical research on networked decision-making domains outside of the realm of institutional theorists (Agranoff & McGuire, 2001; Provan, Fish & Sydow, 2007). That is, governance literature has remained focused on abstract conceptualisations of organisation, without focusing on the minutiae of how organisation influences action in real-world applications. A recent study by Black (2008) has provided an initial foothold for governance researchers into networked decision-making domains. This study builds upon Black’s (2008) work by aiming to explore and understand the problem space of making decisions subject to complex jurisdictional and relational interdependencies. 
That is, the research examines the formal and informal structures, relationships, and forums that operationalise debates and interactions between decision-making actors as they vie for influence over deciding what to build, where, when and how in airport-proximal development projects. The research mobilises a mixture of qualitative and quantitative methods to examine three embedded cases of airport-proximal development from a network governance perspective. Findings from the research provide a new understanding of the ways in which informal actor networks underpin and combine with formal decision-making networks to create new (or realigned) governance spaces that facilitate decision-making during complex phases of development planning. The research is timely, and responds well to Isett, Mergel, LeRoux, Mischen and Rethemeyer’s (2011) recent critique of limitations within current network governance literature, specifically to their noted absence of empirical studies that acknowledge and interrogate the simultaneity of formal and informal network structures within network governance arrangements (Isett et al., 2011, pp. 162-166). The combination of social network analysis (SNA) techniques and thematic enquiry has enabled the research to document and interpret the ways in which decision-making actors organise to overcome complex problems for planning infrastructure. An innovative approach to using association networks has been used to provide insights into the importance of the different ways actors interact with one another, thus providing a simple yet valuable addition to the increasingly popular discipline of SNA. The research also identifies when and how different types of networks (i.e. formal and informal) are able to overcome currently known limitations to network governance (see McGuire & Agranoff, 2011), thus adding depth to the emerging body of network governance literature surrounding limitations to network ways of working (i.e. 
Rhodes, 1997a; Keast & Brown, 2002; Rethemeyer & Hatmaker, 2008; McGuire & Agranoff, 2011). Contributions are made to practice via the provision of a timely understanding of how horizontal fora between airports and their regions are used, particularly in the context of how they reframe the governance of decision-making for airport-proximal infrastructure development. This new understanding will enable government and industry actors to better understand the structural impacts of governance arrangements before they design or adopt them, particularly for factors such as efficiency of information, oversight, and responsiveness to change.
Abstract:
A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. 
Initially, requirements are generated from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. 
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
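The clock synchronisation described above borrows elements of IEEE-1588 PTP. The core of a standard PTP exchange is a simple offset/delay calculation, sketched below; this is the textbook symmetric-path formula, not BabelFuse's actual firmware protocol:

```python
# Textbook IEEE-1588 (PTP) offset/delay computation from one Sync /
# Delay_Req exchange, assuming a symmetric network path. Illustrative only:
# the thesis's protocol is "based on elements of" PTP, not reproduced here.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync (master clock); t2: slave receives Sync
    (slave clock); t3: slave sends Delay_Req (slave clock); t4: master
    receives Delay_Req (master clock).
    Returns (slave_clock_offset, one_way_delay) in the same time units."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Example: slave clock runs 5 us ahead of the master; true one-way delay
# is 2 us. The exchange recovers both values exactly.
offset, delay = ptp_offset_and_delay(t1=100.0, t2=107.0, t3=110.0, t4=107.0)
# offset == 5.0, delay == 2.0
```

The symmetric-path assumption is why hardware timestamping close to the radio matters: any asymmetry in the path appears directly as an offset error.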
Abstract:
Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to largely biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). The decorrelation transformation can substantially amplify the biases in the phase measurements. The purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is sufficiently large for least-squares estimation of RTK solutions as well. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. AR reliability is usually characterised by the AR success rate. In this contribution, an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included in a more complete evaluation of AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio-test in the ambiguity validation process. 
In this paper, Galileo constellation data are tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can the AR validation provide decisions about the correctness of AR that are close to reality, with both low AR risk and false alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix at a given high success rate from the multiple constellations, instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
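The AR success rate discussed above is commonly lower-bounded with the integer-bootstrapping formula P_s = ∏_i (2Φ(1/(2σ_i)) − 1), where the σ_i are conditional standard deviations of the decorrelated ambiguities. The sketch below evaluates that standard bound; it is an assumption for illustration, and the paper's exact evaluation may differ:

```python
# Sketch of the standard integer-bootstrapping lower bound on the AR
# success rate: P_s = prod_i (2 * Phi(1 / (2 * sigma_i)) - 1), with
# sigma_i the conditional std devs of the (decorrelated) ambiguities.
# Textbook bound, assumed here for illustration only.
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_success_rate(cond_std_devs):
    p = 1.0
    for sigma in cond_std_devs:
        p *= 2.0 * norm_cdf(1.0 / (2.0 * sigma)) - 1.0
    return p

# Precise conditional std devs (in cycles) -> success rate near 1 ...
high = bootstrap_success_rate([0.05, 0.06, 0.08])
# ... noisy ones -> a low rate, motivating fixing only a subset (PAR).
low = bootstrap_success_rate([0.4, 0.5, 0.6])
```

Because the bound is a product over ambiguities, dropping the noisiest conditional components raises the success rate of the remaining subset, which is the intuition behind partial fixing.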
Abstract:
Middle schooling is a crucial area of education where adolescents experiencing physiological and psychological changes require expert guidance. As more research evidence is provided about adolescent learning, teachers are considered pivotal to adolescents’ educational development. Reform measures need to be targeted at two levels: inservice and preservice teachers. This quantitative study employs a 40-item, five-part Likert scale survey to understand preservice teachers’ (n=142) perceptions of their confidence to teach in the middle school at the conclusion of their tertiary education. The survey instrument was developed from the literature with connections to the Queensland College of Teachers professional standards. Results indicated that preservice teachers perceived themselves as capable of creating a positive classroom environment (seven items rated above 80%), except for behaviour management (two items below 80%), and they considered their pedagogical knowledge to be adequate (7 out of 8 items above 84%). Items associated with implementing middle schooling curriculum had varied responses (e.g., implementing literacy and numeracy were 74% while implementing learning with real-world connections was 91%). This information may assist coursework designers. For example, if significant percentages of preservice teachers indicate they believe they were not well prepared for assessment and reporting in the middle school, then course designers can target these areas more effectively.
Abstract:
Distributed Genetic Algorithms (DGAs) designed for the Internet must take its high communication cost into consideration. For island-model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies of the same degree of connectivity. The applicability of the method to real-world problems is demonstrated on a hard optimization problem in VLSI design.
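The migration step that such a topology optimizer tunes can be sketched with a toy island-model GA. The sketch below uses a fixed ring topology on a OneMax problem; the adaptive topology optimization itself is not reproduced, and all parameter choices are illustrative:

```python
# Toy island-model GA on OneMax, illustrating the migration step whose
# topology the paper's adaptive optimizer tunes. A fixed ring stands in
# for the adaptive topology; parameters are illustrative choices.
import random

def evolve_island(pop, bits, rng):
    """One generation of tournament selection plus bit-flip mutation."""
    def fitness(ind):
        return sum(ind)  # OneMax: count of 1-bits
    new_pop = []
    for _ in range(len(pop)):
        a, b = rng.sample(pop, 2)
        parent = a if fitness(a) >= fitness(b) else b
        # Flip each bit with probability 1/bits.
        new_pop.append([bit ^ (rng.random() < 1.0 / bits) for bit in parent])
    return new_pop

def migrate(islands, topology):
    """Copy each island's best individual over the worst individual of
    each neighbour; topology maps island index -> destination indices."""
    best = [max(isl, key=sum) for isl in islands]
    for src, dests in topology.items():
        for dst in dests:
            worst = min(range(len(islands[dst])),
                        key=lambda i: sum(islands[dst][i]))
            islands[dst][worst] = list(best[src])

rng = random.Random(42)
bits, n_islands, pop_size = 20, 4, 10
islands = [[[rng.randint(0, 1) for _ in range(bits)]
            for _ in range(pop_size)] for _ in range(n_islands)]
ring = {i: [(i + 1) % n_islands] for i in range(n_islands)}  # sparse links
for gen in range(30):
    islands = [evolve_island(isl, bits, rng) for isl in islands]
    if gen % 5 == 4:  # infrequent migration keeps communication cost low
        migrate(islands, ring)
best_fitness = max(sum(ind) for isl in islands for ind in isl)
```

The ring's degree of connectivity (one outgoing link per island) is exactly the quantity the paper holds constant when comparing its optimized topology against static and random ones.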
Abstract:
Purpose Anecdotal evidence suggests that some sunglass users prefer yellow tints for outdoor activities, such as driving, and research has suggested that such tints improve the apparent contrast and brightness of real-world objects. The aim of this study was to establish whether yellow filters resulted in objective improvements in performance for visual tasks relevant to driving. Methods Response times of nine young (age, mean ± SD, 31.4 ± 6.7 years) and nine older (74.6 ± 4.8 years) adults were measured using video presentations of traffic hazards (driving hazard perception task) and a simple low-contrast grating that appeared at random peripheral locations on a computer screen. Response times were compared when participants wore a yellow filter (with and without a linear polarizer) versus a neutral density filter (with and without a linear polarizer). All lens combinations were matched to have similar luminance transmittances (~27%). Results In the driving hazard perception task, the young but not the older participants responded significantly more rapidly to hazards when wearing a yellow filter than with a luminance-matched neutral density filter (mean difference, 450 milliseconds). In the low-contrast grating task, younger participants also responded more quickly in the yellow filter condition, but only when combined with a polarizer. Although response times increased with increasing stimulus eccentricity for the low-contrast grating task, for the younger participants this slowing of response times with increased eccentricity was reduced in the presence of a yellow filter, indicating that perception of more peripheral objects may be improved by this filter combination. Conclusions Yellow filters improve response times for younger adults for visual tasks relevant to driving.
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, to improve the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
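The greedy scheduling idea described above can be sketched as follows. The candidate values and confidence scores here are hard-coded stand-ins (in WebPut they come from web search queries and could be recomputed as values are filled in), and all identifiers are hypothetical:

```python
# Illustrative sketch of a confidence-based greedy imputation schedule in
# the spirit of WebPut: repeatedly impute the missing value whose best
# candidate currently has the highest confidence. Candidates and scores
# are hard-coded stand-ins; identifiers are hypothetical.

def greedy_impute(missing, candidates):
    """missing: list of cell ids; candidates: cell id -> list of
    (value, confidence) pairs. Returns (cell, value, confidence)
    triples in the order the cells were imputed."""
    remaining = set(missing)
    order = []
    while remaining:
        # Pick the cell whose best candidate is currently most confident.
        cell = max(remaining,
                   key=lambda c: max(conf for _, conf in candidates[c]))
        value, conf = max(candidates[cell], key=lambda vc: vc[1])
        order.append((cell, value, conf))
        remaining.remove(cell)
        # A full implementation would re-issue queries here, since the
        # newly filled value can sharpen later candidates.
    return order

cands = {
    "row1.city": [("Brisbane", 0.9), ("Sydney", 0.2)],
    "row2.year": [("2006", 0.6)],
    "row3.name": [("QUT", 0.75)],
}
plan = greedy_impute(list(cands), cands)
# Highest-confidence cells are imputed first:
# row1.city (0.9), then row3.name (0.75), then row2.year (0.6)
```

Scheduling high-confidence imputations first is what lets earlier fills improve the query context for harder, lower-confidence cells.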
Abstract:
This paper introduces PartSS, a new partition-based filtering technique for tasks performing string comparisons under edit distance constraints. PartSS offers improvements over the state-of-the-art method NGPP with the implementation of a new partitioning scheme, and also improves filtering abilities by exploiting theoretical results on shifting and scaling ranges, thus accelerating the rate of calculating edit distance between strings. PartSS filtering has been implemented within two major tasks of data integration: similarity join and approximate membership extraction under edit distance constraints. The evaluation on an extensive range of real-world datasets demonstrates major gains in efficiency over the NGPP and QGrams approaches.
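Partition-based edit distance filters such as NGPP and PartSS rest on a pigeonhole argument: if two strings are within edit distance τ, then splitting one into τ+1 chunks guarantees that at least one chunk survives intact in the other. A minimal sketch of that filter plus the dynamic-programming verification step (PartSS's shifting and scaling refinements are not reproduced here):

```python
# Hedged sketch of the pigeonhole principle behind partition-based edit
# distance filters like NGPP/PartSS: split one string into tau+1 chunks;
# if no chunk appears in the other string, the edit distance must exceed
# tau, so the expensive verification can be skipped.

def partitions(s, k):
    """Split s into k near-equal contiguous chunks."""
    n = len(s)
    bounds = [round(i * n / k) for i in range(k + 1)]
    return [s[bounds[i]:bounds[i + 1]] for i in range(k)]

def may_match(s, t, tau):
    """Cheap filter: if s and t are within edit distance tau, at least
    one of the tau+1 chunks of s must occur somewhere in t."""
    return any(chunk in t for chunk in partitions(s, tau + 1))

def edit_distance(s, t):
    """Standard dynamic-programming verification step (Levenshtein)."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (cs != ct)))
        prev = cur
    return prev[-1]

# One substitution (i -> u): the filter passes, and verification confirms.
s, t, tau = "similarity", "simularity", 1
survives_filter = may_match(s, t, tau)  # True: they share the chunk "arity"
distance = edit_distance(s, t)          # 1
```

The filter never rejects a true match (the pigeonhole guarantee) but can pass false candidates, which is why a verification step like the DP above always follows it.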
Abstract:
In this article, we report on the findings of an exploratory study into the experience of undergraduate students as they learn new mathematical models. Qualitative and quantitative data based around the students’ approaches to learning new mathematical models were collected. The data revealed that students actively adopt three approaches to understanding a new mathematical model: gathering information for the task of understanding the model, practising with and using the model, and finding interrelationships between elements of the model. We found that the students appreciate mathematical models that have a real-world application and that this can be used to engage students in higher level learning approaches.
Abstract:
We conducted an exploratory study of a mobile energy monitoring tool: the Dashboard. Our point of departure from prior work was our emphasis on end-user customisation and social sharing. Applying extensive feedback, we deployed the Dashboard in real-world conditions to socially linked research participants for a period of five weeks. Participants were encouraged to devise, construct, place, and view various data feeds. The aim of our study was to test the assumption that participants, having control over their Dashboard configuration, would engage, and remain engaged, with their energy feedback throughout the trial. Our research points to a set of design issues surrounding the adoption and continued use of such tools. A novel finding of our study is the impact of social links between participants on their continued engagement with the Dashboard. Our results also illustrate the emergence of energy-voyeurism, a form of social energy monitoring by peers.
Abstract:
‘Social innovation’ is a construct increasingly used to explain the practices, processes and actors through which sustained positive transformation occurs in the network society (Mulgan, G., Tucker, S., Ali, R., & Sander, B. (2007). Social innovation: What it is, why it matters and how it can be accelerated. Oxford: Skoll Centre for Social Entrepreneurship; Phills, J. A., Deiglmeier, K., & Miller, D. T. Stanford Social Innovation Review, 6(4):34–43, 2008). Social innovation has been defined as a “novel solution to a social problem that is more effective, efficient, sustainable, or just than existing solutions, and for which the value created accrues primarily to society as a whole rather than private individuals” (Phills, J. A., Deiglmeier, K., & Miller, D. T. Stanford Social Innovation Review, 6(4):34–43, 2008: 34). Emergent ideas of social innovation challenge some traditional understandings of the nature and role of the Third Sector, as well as shining a light on those enterprises within the social economy that configure resources in novel ways. In this context, social enterprises – which provide a social or community benefit and trade to fulfil their mission – have attracted considerable policy attention as one source of social innovation within a wider field of action (see Leadbeater, C. (2007). ‘Social enterprise and social innovation: Strategies for the next 10 years’, Cabinet Office, Office of the Third Sector, http://www.charlesleadbeater.net/cms xstandard/social_enterprise_innovation.pdf. Last accessed 19/5/2011). And yet, while social enterprise seems to have gained some symbolic traction in society, there is to date relatively limited evidence of its real-world impacts (Dart, R. Not for Profit Management and Leadership, 14(4):411–424, 2004). In other words, we do not know much about the social innovation capabilities and effects of social enterprise. 
In this chapter, we consider the social innovation practices of social enterprise, drawing on Mulgan, Tucker, Ali and Sander’s (2007: 5) three dimensions of social innovation: new combinations or hybrids of existing elements; cutting across organisational, sectoral and disciplinary boundaries; and leaving behind compelling new relationships. Based on a detailed survey of 365 Australian social enterprises, we examine their self-reported business and mission-related innovations, the ways in which they configure and access resources, and the practices through which they diffuse innovation in support of their mission. We then consider how these findings inform our understanding of the social innovation capabilities and effects of social enterprise, and their implications for public policy development.