48 results for Real-life Projects
Abstract:
The construction field is dynamic and dominated by complex, ill-defined problems for which myriad possible solutions exist. Teaching students to solve construction-related problems requires an understanding of the nature of these complex problems as well as the implementation of effective instructional strategies to address them. Traditional approaches to teaching construction planning and management have long been criticized for presenting students primarily with well-defined problems, an approach inconsistent with the challenges encountered in the industry. However, growing evidence suggests that innovative teaching approaches, such as interactive simulation games, offer more active, hands-on, problem-based learning opportunities for students to synthesize and test acquired knowledge in settings more closely aligned with real-life construction scenarios. Simulation games have demonstrated educational value in improving students' problem-solving skills and motivation through critical attributes such as interaction and feedback-supported active learning. Nevertheless, broad acceptance of simulation games in construction engineering education remains limited. While their benefits are recognized, research on the role of simulation games in educational settings lacks a unified approach to developing, implementing, and evaluating these games. To address this gap, this paper provides an overview of the challenges associated with evaluating the effectiveness of simulation games in construction education that still impede their wide adoption. An overview of the current status, together with results from the recently implemented Virtual Construction Simulator (VCS) game at Penn State, provides lessons learned and is intended to guide future efforts to develop interactive simulation games that reach their full potential.
Abstract:
This paper introduces a novel approach to free-text keystroke dynamics authentication that incorporates the keyboard's key layout. The method extracts timing features from specific key-pairs, and the Euclidean distance is then used to measure the similarity between a user's profile data and his/her test data. The results obtained with this method are reasonable for free-text authentication while placing minimal constraints on the user. Moreover, this study shows that flight time yields better authentication results than dwell time. Notably, the results were obtained with only one training sample, for practicality and ease of real-life application.
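The abstract does not specify the exact feature set, so the following is only a minimal sketch of the comparison it describes: Euclidean distance over mean flight times of the key-pairs shared by a profile and a test sample. The key-pairs, timings, and acceptance threshold below are hypothetical.

```python
import math

def keystroke_distance(profile, test):
    """Euclidean distance over mean flight times (ms) of key-pairs that
    appear in both the enrolment profile and the test sample; lower
    values indicate a closer match."""
    shared = set(profile) & set(test)
    if not shared:
        return float("inf")  # nothing to compare
    d = math.sqrt(sum((profile[p] - test[p]) ** 2 for p in shared))
    return d / len(shared)  # normalise by the number of shared pairs

# Hypothetical enrolment and test data for the digraphs "th" and "he".
profile = {("t", "h"): 120.0, ("h", "e"): 95.0}
test = {("t", "h"): 130.0, ("h", "e"): 90.0}
print("accept" if keystroke_distance(profile, test) < 20.0 else "reject")
```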
Abstract:
In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the constraint on the attenuation factor in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both standard and relaxed communicability indices, we look at ways of establishing the most important individuals for broadcasting and receiving messages, in relation to community bridging roles. We compare these measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
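For the standard measure, the Katz-style series converges only when the attenuation factor alpha stays below the reciprocal of the spectral radius. A minimal sketch of that constraint on a single snapshot follows; the toy adjacency matrix and the 0.85 safety ratio are illustrative choices, not values from the paper.

```python
import numpy as np

def katz_centrality(A, ratio=0.85):
    """Solve (I - alpha*A) c = 1 with alpha set to `ratio` times the
    convergence bound 1/rho(A), where rho(A) is the spectral radius."""
    rho = max(abs(np.linalg.eigvals(A)))
    alpha = ratio / rho  # keep alpha strictly below 1/rho(A)
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))

# Toy 4-node snapshot: the best-connected node gets the highest score.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
print(katz_centrality(A))
```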
Abstract:
This paper considers the use of Association Rule Mining (ARM) and our proposed Transaction-based Rule Change Mining (TRCM) to identify the rule types present in tweets' hashtags over specific consecutive periods of time and their linkage to real-life occurrences. Our novel algorithm is termed TRCM-RTI, in reference to Rule Type Identification. We create Time Frame Windows (TFWs) to detect evolvement statuses and calculate the lifespan of hashtags in online tweets. We link RTI to real-life events by monitoring and recording rule evolvement patterns in TFWs on the Twitter network.
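TRCM-RTI itself is not reproduced here; the sketch below only illustrates the Time Frame Window idea, measuring a hashtag's lifespan as the span of consecutive windows between its first and last appearance. The tweet stream and daily window size are hypothetical.

```python
def hashtag_lifespans(tweets, window_hours=24):
    """Bucket tweets into consecutive Time Frame Windows and return, for
    each hashtag, its lifespan in windows (first to last appearance).

    tweets: iterable of (timestamp_in_hours, set_of_hashtags) pairs.
    """
    first_seen, last_seen = {}, {}
    for ts, tags in tweets:
        window = int(ts // window_hours)
        for tag in tags:
            first_seen.setdefault(tag, window)
            last_seen[tag] = window
    return {t: last_seen[t] - first_seen[t] + 1 for t in first_seen}

# Hypothetical stream: #final spans three daily windows, #kickoff one.
stream = [(2, {"#kickoff", "#final"}), (30, {"#final"}), (55, {"#final"})]
print(hashtag_lifespans(stream))
```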
Abstract:
This paper presents an in-depth critical discussion and derivation of a detailed small-signal analysis of the Phase-Shifted Full-Bridge (PSFB) converter. Circuit parasitics, resonant inductance and transformer turns ratio have all been taken into account in the evaluation of this topology's open-loop control-to-output, line-to-output and load-to-output transfer functions. Accordingly, the significant impact of losses and resonant inductance on the converter's transfer functions is highlighted. The enhanced dynamic model proposed in this paper enables the correct design of the converter compensator, including the effect of parasitics on the dynamic behavior of the PSFB converter. Detailed experimental results for a real-life 36V-to-14V/10A PSFB industrial application show excellent agreement with the predictions from the model proposed herein.
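The enhanced PSFB model itself is not reproduced in the abstract. Purely for orientation, the sketch below evaluates a textbook buck-derived control-to-output transfer function over frequency; it omits the parasitic and resonant-inductance effects that are the paper's contribution, and all component values are hypothetical.

```python
import numpy as np

# Hypothetical output-stage values for a buck-derived converter.
Vin, n = 36.0, 2.4            # input voltage and transformer turns ratio
L, C, R = 10e-6, 470e-6, 1.4  # output filter inductor, capacitor, load

f = np.logspace(1, 5, 200)    # 10 Hz to 100 kHz
s = 2j * np.pi * f
Gvd = (Vin / n) / (1 + s * L / R + s**2 * L * C)  # control-to-output

print(f"DC gain ~ {abs(Gvd[0]):.1f}, "
      f"LC resonance near {1 / (2 * np.pi * np.sqrt(L * C)):.0f} Hz")
```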
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT, and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of the existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
Abstract:
There has been an increased emphasis upon the application of science to humanitarian and development planning, decision-making, and practice, particularly in the context of understanding, assessing, and anticipating risk (e.g. HERR, 2011). However, there remains very little guidance for practitioners on how to integrate sciences they may have had little contact with in the past (e.g. climate science). This has led to confusion as to which ‘science’ might be of use and how it would be best utilised. Furthermore, since this integration has stemmed from a need to be more predictive, agencies are struggling with the problems associated with uncertainty and probability. Whilst a range of expertise is required to build resilience, these guidelines focus solely upon the relevant data, information, knowledge, methods, principles, and perspectives that scientists can provide, which typically lie outside of current humanitarian and development approaches. Using checklists, real-life case studies, and scenarios, the full guidelines take practitioners through a five-step approach to finding, understanding, and applying science. This document provides a short summary of the five steps and some key lessons for integrating science.
Abstract:
We investigate variants of the dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs are already present in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem, which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network, using a measure of individuals' influence (Klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost while ensuring that each individual in the network has at least an alpha fraction of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and can thus be used in real-life applications when creating social-network-based interventions, designing social media campaigns, and potentially improving users' social media experience.
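The paper's randomised rounding scheme and its approximation bound are not reproduced here. Purely as an illustration of the problem itself, the sketch below checks the alpha-rate feasibility condition and runs a naive randomised search over weighted vertex subsets on a toy graph.

```python
import random

def is_alpha_rate_dominating(adj, chosen, alpha):
    """Feasibility check: every vertex must have at least an alpha
    fraction of its neighbours inside the chosen set."""
    for v, nbrs in adj.items():
        if nbrs and sum(u in chosen for u in nbrs) < alpha * len(nbrs):
            return False
    return True

def random_weighted_alpha_dominating_set(adj, weights, alpha,
                                         trials=1000, seed=0):
    """Naive randomised search (not the paper's rounding scheme):
    sample subsets biased towards low-weight vertices and keep the
    cheapest feasible one found."""
    rng = random.Random(seed)
    max_w = max(weights.values())
    best, best_cost = set(adj), sum(weights.values())  # full set is feasible
    for _ in range(trials):
        cand = {v for v in adj if rng.random() < 1 - 0.5 * weights[v] / max_w}
        cost = sum(weights[v] for v in cand)
        if cost < best_cost and is_alpha_rate_dominating(adj, cand, alpha):
            best, best_cost = cand, cost
    return best, best_cost

# Toy graph with influence-style weights as participation costs.
adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2], 4: [2]}
weights = {1: 5.0, 2: 1.0, 3: 2.0, 4: 4.0}
print(random_weighted_alpha_dominating_set(adj, weights, alpha=0.5))
```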
Abstract:
More and more households are purchasing electric vehicles (EVs), and this will continue as we move towards a low-carbon future. There are various projections as to the rate of EV uptake, but all predict an increase over the next ten years. Charging these EVs will produce one of the biggest loads on the low-voltage network. To manage the network, we must take into account not only the number of EVs taken up, but also where on the network they are charging, and at what time. To simulate the impact on the network of high, medium, and low EV uptake (as outlined by the UK government), we present an agent-based model. We initialise the model to assign an EV to a household based on either random distribution or social influence - that is, a neighbour of an EV owner is more likely to also purchase an EV. Additionally, we examine the effect of peak behaviour on the network when charging occurs in the daytime, at night-time, or a mix of both. The model is applied to a neighbourhood in south-east England using smart meter data (half-hourly electricity readings) and real-life charging patterns from an EV trial. Our results indicate that social influence can increase peak demand at a local level (street or feeder), meaning that medium EV uptake can create higher peak demand than currently expected.
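A minimal sketch of the social-influence initialisation described above; the `boost` factor is an illustrative assumption, and the smart meter data, charging patterns, and network simulation are beyond this sketch.

```python
import random

def assign_evs(households, n_evs, neighbours, boost=3.0, seed=0):
    """Pick EV-owning households one at a time; a household next to an
    existing owner is `boost` times more likely to be chosen than one
    with no EV-owning neighbours (illustrative weighting only)."""
    rng = random.Random(seed)
    owners = set()
    while len(owners) < n_evs:
        pool = [h for h in households if h not in owners]
        weights = [boost if any(n in owners for n in neighbours[h]) else 1.0
                   for h in pool]
        owners.add(rng.choices(pool, weights=weights, k=1)[0])
    return owners

# Six households along one street: adoption tends to cluster.
street = list(range(6))
nbrs = {h: [n for n in (h - 1, h + 1) if 0 <= n < 6] for h in street}
print(assign_evs(street, 3, nbrs))
```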
Abstract:
Background: Models of the development and maintenance of childhood anxiety suggest an important role for parent cognitions: negative expectations of children's coping abilities lead to parenting behaviors that maintain child anxiety. The primary aims of the current study were to (1) compare expectations of child vulnerability and coping among mothers of children with anxiety disorders on the basis of whether or not the mothers also had a current anxiety disorder, and (2) examine the degree to which the association between maternal anxiety disorder status and child coping expectations was mediated by how mothers interpreted ambiguous material that referred to their own experience. Methods: The association between interpretations of threat, negative emotion, and control was assessed using hypothetical ambiguous scenarios in a sample of 271 anxious and nonanxious mothers of 7- to 12-year-old children with an anxiety disorder. Mothers also rated their expectations when presented with real-life challenge tasks. Results: There was a significant association between maternal anxiety disorder status and negative expectations of child coping behaviors. Mothers' self-referent interpretations were found to mediate this relationship. Responses to ambiguous hypothetical scenarios correlated significantly with responses to real-life challenge tasks. Conclusions: Treatments for childhood anxiety disorders in the context of parental anxiety disorders may benefit from the inclusion of a component that directly addresses parental cognitions. Some inconsistencies were found when comparing maternal expectations in response to hypothetical scenarios with those in response to real-life challenges; this should be addressed in future research.
Abstract:
In the last decade, several research results have presented formulations for the auto-calibration problem. Most of these have relied on the evaluation of vanishing points to extract the camera parameters. Normally, vanishing points are evaluated using pedestrians or the Manhattan World assumption, i.e., it is assumed that the scene is composed of orthogonal planar surfaces. In this work, we present a robust framework for auto-calibration, with improved results and generalisability for real-life situations. This framework is capable of handling problems such as occlusions and the presence of unexpected objects in the scene. In our tests, we compare our formulation with the state-of-the-art in auto-calibration using pedestrian and Manhattan World-based assumptions. This paper reports on experiments conducted using publicly available datasets; the results show that our formulation represents an improvement over the state-of-the-art.
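The paper's framework is not detailed in the abstract; the classical building block underlying vanishing-point approaches can, however, be sketched briefly: in homogeneous coordinates, the vanishing point of two image segments that are parallel in the 3-D scene is the cross product of the two image lines. The pixel coordinates below are hypothetical.

```python
import numpy as np

def line_through(p, q):
    """Homogeneous image line through two pixel points."""
    return np.cross([*p, 1.0], [*q, 1.0])

def vanishing_point(seg1, seg2):
    """Intersection of two image lines whose scene counterparts are
    parallel: the cross product of the lines, dehomogenised."""
    v = np.cross(line_through(*seg1), line_through(*seg2))
    return v[:2] / v[2]  # back to pixel coordinates

# Two hypothetical edges of a road converging towards the horizon.
print(vanishing_point(((100, 400), (300, 250)), ((500, 400), (330, 250))))
```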
Abstract:
Massive Open Online Courses (MOOCs) have become very popular among learners; millions of users from around the world have registered with leading platforms, and hundreds of universities (and other organizations) offer MOOCs. However, the sustainability of MOOCs is a pressing concern, as MOOCs incur up-front creation costs, maintenance costs to keep content relevant, and ongoing support costs to provide facilitation while a course is being run. At present, charging a fee for certification (for example, Coursera Signature Track and FutureLearn Statement of Completion) seems a popular business model. In this paper, the authors discuss other possible business models and their pros and cons, including:
- Freemium model: providing content freely but charging for premium services such as course support, tutoring, and proctored exams.
- Sponsorships: courses can be created in collaboration with industry, where industry sponsorships cover the costs of course production and offering. For example, the Teaching Computing course was offered by the University of East Anglia on the FutureLearn platform with sponsorship from British Telecom, while the UK Government sponsored the course Introduction to Cyber Security offered by the Open University on FutureLearn.
- Initiatives and grants: governments, the EU Commission, or corporations could commission the creation of courses through grants and initiatives according to the skills gaps identified for the economy. For example, the UK Government's National Cyber Security Programme has supported a course on cyber security; similar initiatives could also fund relevant course development and offering.
- Donations: free software, Wikipedia, and early OER initiatives such as the MIT OpenCourseWare accept donations from the public, and learners could likewise contribute (if they wish) to the maintenance and facilitation of a course.
- Merchandise: selling merchandise could also bring revenue to MOOCs. As many participants do not seek formal recognition for completing a MOOC (European Commission, 2014), merchandise that presents their achievement in a playful way could well be attractive to them.
- Sale of supplementary material: supplementary course material, in the form of an online or physical book or similar, could be sold, with the revenue reinvested in course delivery.
- Selective advertising: courses could carry advertisements relevant to learners.
- Data sharing: though a controversial topic, sharing learner data with relevant employers or similar parties could be another revenue model for MOOCs.
- Follow-on events: courses could lead to paid follow-on summer schools, courses, or other real-life or online events, with a percentage of the revenue passed on to the MOOC for its upkeep.
Though these models are all possible ways of generating revenue for MOOCs, some are more controversial and sensitive than others. Nevertheless, unless appropriate business models are identified, the sustainability of MOOCs will remain problematic.
Video stimuli reduce object-directed imitation accuracy: a novel two-person motion-tracking approach
Abstract:
Imitation is an important form of social behavior, and research has aimed to discover and explain its neural and kinematic aspects. However, much of this research has featured single participants imitating in response to pre-recorded video stimuli. This is in spite of findings showing reduced neural activation to video versus real-life movement stimuli, particularly in the motor cortex. We investigated the degree to which video stimuli may affect the imitation process using a novel motion-tracking paradigm with high spatial and temporal resolution. We recorded 14 positions on the hands, arms, and heads of two individuals in an imitation experiment. One individual moved freely within given parameters (moving balls across a series of pegs) while a second participant imitated. This task was performed at either simple (one ball) or complex (three balls) movement difficulty, and either face-to-face or via a live video projection. After an exploratory analysis, three dependent variables were chosen for examination: 3D grip position, joint angles in the arm, and grip aperture. A cross-correlation and multivariate analysis revealed that object-directed imitation accuracy (as represented by grip position) was reduced with video compared to face-to-face feedback, and with complex compared to simple difficulty. This was most prevalent in the left-right and forward-back motions, relative to the imitator sitting face-to-face with the actor or with a live projected video of the same actor. The results suggest that for tasks which require object-directed imitation, video stimuli may not be an ecologically valid way to present task materials. However, no similar effects were found for the joint angle and grip aperture variables, suggesting that there are limits to the influence of video stimuli on imitation. The implications of these results are discussed with regard to previous findings, along with suggestions for future experimentation.
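A simplified sketch of the lagged cross-correlation idea, applied to one synthetic component of a grip-position trace (the imitator reproduces the actor's movement with a delay); the full 14-marker, multivariate analysis is beyond this illustration.

```python
import numpy as np

def peak_cross_correlation(actor, imitator, max_lag=30):
    """Peak normalised cross-correlation between two 1-D position traces,
    searched over non-negative lags to allow for the imitator's delay."""
    a = (actor - actor.mean()) / actor.std()
    b = (imitator - imitator.mean()) / imitator.std()
    n = len(a)
    return max(np.dot(a[: n - lag], b[lag:]) / (n - lag)
               for lag in range(max_lag + 1))

# Synthetic traces: the imitator lags the actor by ~10 samples.
t = np.linspace(0, 4 * np.pi, 200)
actor = np.sin(t)
imitator = np.roll(actor, 10) + 0.1 * np.random.default_rng(0).normal(size=200)
print(round(peak_cross_correlation(actor, imitator), 3))
```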
Abstract:
Twitter has become a dependable microblogging tool for real-time information dissemination and the broadcast of newsworthy events. Its users sometimes break news on the network faster than traditional news agencies, owing to their frequent presence at ongoing real-life events. Different topic detection methods are currently used to match Twitter posts to real-life news from mainstream media. In this paper, we analyse tweets relating to the 2012 English FA Cup final by applying our novel method, TRCM, to extract association rules present in the hashtag keywords of tweets in different time-slots. Our system identifies evolving hashtag keywords with strong association rules in each time-slot. We then map the identified hashtag keywords to event highlights of the game as reported in the ground truth of the mainstream media. The performance measures from our experiments show that our method performs well as a Topic Detection and Tracking approach.
Abstract:
Anticipating the number and identity of bidders has a significant influence on many theoretical results of the auction itself and on bidders' bidding behaviour. When a bidder knows in advance which specific bidders are likely competitors, this knowledge gives the company a head start when setting its bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension, and then only qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder's probability of participating in a future auction as a function of the tender's economic size, removing the bias caused by the distribution of contract-size opportunities. In this way, a bidder or auctioneer will be able to estimate the likelihood that a specific group of key, previously identified bidders will participate in a future tender.
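The paper's bias-removal method is not spelled out in the abstract. As a hypothetical illustration only, one could tabulate a single bidder's past participation frequency across log-scaled tender-size brackets like this:

```python
import numpy as np

def participation_curve(tender_sizes, participated, n_bins=4):
    """Empirical participation frequency for one bidder as a function of
    tender economic size, binned on a log scale to offset the skew of
    contract-size opportunities (illustrative only)."""
    log_size = np.log10(tender_sizes)
    edges = np.linspace(log_size.min(), log_size.max(), n_bins + 1)
    idx = np.clip(np.digitize(log_size, edges) - 1, 0, n_bins - 1)
    return [(10 ** edges[i], 10 ** edges[i + 1], participated[idx == i].mean())
            for i in range(n_bins)]

# Hypothetical history for one bidder over 12 past tenders.
sizes = np.array([1e5, 2e5, 5e5, 8e5, 1e6, 2e6, 3e6, 5e6, 8e6, 1e7, 2e7, 5e7])
took_part = np.array([0, 0, 0, 1, 1, 1, 1, 1, 0, 0, 0, 0])
for lo, hi, p in participation_curve(sizes, took_part):
    print(f"{lo:10.0f} - {hi:10.0f}: {p:.2f}")
```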