885 results for predictive models
Abstract:
The diverse needs of children have been drawing global attention from both academic and practitioner communities. Based on semi-structured interviews with 23 kin caregivers and five school personnel in Shijiapu Town, Jilin Province, China, this paper presents a needs model for rural school-age children left behind by their migrant parents. This Chinese model is compared to the needs identification mechanism developed by the Australian Research Alliance for Children and Youth. The paper outlines the needs common to children in different contexts, and also highlights needs that are not explicit in the Australian Research Alliance for Children and Youth framework, such as empowerment and agency, or that are perhaps given insufficient weight, such as education. In discussing the relationships among different needs and the aspects missing from the framework, it is argued that culture should be more explicitly recognised when defining need.
Abstract:
Quality of experience (QoE) measures the overall perceived quality of mobile video delivery, combining subjective user experience with objective system performance. Current QoE computing models have two main limitations: 1) insufficient consideration of the factors influencing QoE; and 2) limited study of QoE models for acceptability prediction. In this paper, a set of novel acceptability-based QoE models, denoted A-QoE, is proposed based on the results of comprehensive user studies on subjective quality acceptance assessments. The models are able to predict users' acceptability and pleasantness in various mobile video usage scenarios. Statistical regression analysis has been used to build the models with a group of influencing factors as independent predictors, including encoding parameters and bitrate, video content characteristics, and mobile device display resolution. The performance of the proposed A-QoE models has been compared with three well-known objective Video Quality Assessment metrics: PSNR, SSIM and VQM. The proposed A-QoE models offer high prediction accuracy and usage flexibility. Future user-centred mobile video delivery systems can benefit from applying the proposed QoE-based management to optimize video coding and quality delivery decisions.
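To illustrate the kind of regression-based acceptability model the abstract describes, the sketch below applies a logistic link over bitrate, a content characteristic, and display resolution. The coefficients are hypothetical placeholders, not the values fitted in the study:

```python
import math

# Hypothetical coefficients: NOT the paper's fitted values.
# A real A-QoE model would estimate these by regression on
# subjective acceptance data from user studies.
COEFFS = {
    "intercept": -2.0,
    "log_bitrate_kbps": 0.9,     # encoding bitrate (log scale)
    "motion_intensity": -0.4,    # video content characteristic
    "display_height_px": 0.001,  # device display resolution
}

def predict_acceptability(bitrate_kbps, motion_intensity, display_height_px):
    """Return the probability that a user finds the video acceptable."""
    z = (COEFFS["intercept"]
         + COEFFS["log_bitrate_kbps"] * math.log(bitrate_kbps)
         + COEFFS["motion_intensity"] * motion_intensity
         + COEFFS["display_height_px"] * display_height_px)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link

# Higher bitrate should raise predicted acceptability, all else equal.
low = predict_acceptability(400, 0.5, 720)
high = predict_acceptability(4000, 0.5, 720)
```

A delivery system could use such a predictor to pick the cheapest encoding whose predicted acceptability clears a target threshold.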
Abstract:
Whole-image descriptors such as GIST have been used successfully for persistent place recognition when combined with temporal or sequential filtering techniques. However, whole-image descriptor localization systems often apply a heuristic rather than a probabilistic approach to place recognition, requiring substantial environment-specific tuning prior to deployment. In this paper we present a novel online solution that uses statistical approaches to calculate place recognition likelihoods for whole-image descriptors, without requiring either environmental tuning or pre-training. Using a real-world benchmark dataset, we show that this method creates distributions appropriate to a specific environment in an online manner. Our method performs comparably to FAB-MAP in raw place recognition performance, and integrates into a state-of-the-art probabilistic mapping system to provide superior performance to whole-image methods that are not based on true probability distributions. The method provides a principled means for combining the powerful change-invariant properties of whole-image descriptors with probabilistic back-end mapping systems, without the need for prior training or system tuning.
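A minimal sketch of the general idea, turning raw whole-image descriptor distances into normalised place-recognition likelihoods by fitting a distance distribution online. This is an illustrative stand-in under simplified assumptions, not the paper's exact formulation:

```python
import math

def descriptor_distance(a, b):
    """Euclidean distance between two whole-image descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_likelihoods(query, stored):
    """Normalise descriptor distances into match likelihoods by modelling
    the distance distribution of the current environment online
    (mean/std fitted from the observed distances themselves)."""
    dists = [descriptor_distance(query, s) for s in stored]
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    std = math.sqrt(var) or 1.0  # guard against zero spread
    # Distances far below the mean indicate likely matches.
    scores = [math.exp(-(d - mean) / std) for d in dists]
    total = sum(scores)
    return [s / total for s in scores]
```

Because the mean and spread are re-estimated from the environment's own distances, no per-environment tuning constant is needed, which is the property the abstract emphasises.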
Abstract:
Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models representing multiple topics in a collection of documents, and has been widely utilized in fields such as machine learning and information retrieval. However, little is known about its effectiveness in information filtering. Patterns are generally considered more representative than single terms for representing documents. In this paper, a novel information filtering model, the Pattern-based Topic Model (PBTM), is proposed to represent text documents not only using topic distributions at a general level but also using semantic pattern representations at a detailed, specific level, both of which contribute to accurate document representation and document relevance ranking. Extensive experiments are conducted to evaluate the effectiveness of PBTM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model achieves outstanding performance.
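The two-level idea behind PBTM (general topic distributions plus specific pattern representations) can be sketched as a combined relevance score. The functions, the subset-based pattern match, and the `alpha` weighting below are illustrative assumptions, not the model's actual formulation:

```python
def pattern_score(doc_terms, patterns):
    """Score a document by the term-set patterns it contains; longer
    patterns are more specific, so they are weighted by their length."""
    terms = set(doc_terms)
    return sum(len(p) for p in patterns if set(p) <= terms)

def topic_score(doc_topic_dist, query_topic_dist):
    """Topic-level relevance; a simple dot product stands in for the
    model's general-level topic match."""
    return sum(d * q for d, q in zip(doc_topic_dist, query_topic_dist))

def pbtm_rank(doc, query_topics, patterns, alpha=0.5):
    """Combine general topic-level and specific pattern-level evidence
    into one relevance score (alpha is an illustrative mixing weight)."""
    return (alpha * topic_score(doc["topics"], query_topics)
            + (1 - alpha) * pattern_score(doc["terms"], patterns))
```

A document that both matches the query's dominant topic and contains a discovered pattern outranks one that matches on only one level, which is the intuition the abstract describes.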
Abstract:
In this age of rapidly evolving technology, teachers are encouraged by government, syllabuses, school management, and parents to adopt ICTs. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students' learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified which technologies science teachers use on a regular basis, or whether some purchased technologies have proven too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers on the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, resulting in the development of the PETTaL model, which captured the salient factors in the data.
This model incorporated usability theory from the Human-Computer Interaction literature, as well as education theory and models such as Mishra and Koehler's (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom / learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing an agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET can make the tacit knowledge that experienced teachers have acquired about technology use in classrooms explicit to novice teachers. Additionally, the PET could be used as a research tool to discover a teacher's professional development needs. Therefore, the outcomes of this study can aid teachers in the process of selecting educationally productive and sustainable new technologies for their science classrooms.
This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies in the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers' use of technology in the classroom. The PETTaL model, grounded in data from this study, responds to the calls in the literature for TPACK's further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher's reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher's personal professional development plan. It will be the basis for further studies in this field.
Abstract:
BACKGROUND Mosquito-borne diseases are climate sensitive, and there has been increasing concern over the impact of climate change on future disease risk. This paper projected the potential future risk of Barmah Forest virus (BFV) disease under climate change scenarios in Queensland, Australia. METHODS/PRINCIPAL FINDINGS We obtained data on notified BFV cases, climate (maximum and minimum temperature and rainfall), socio-economic and tidal conditions for the current period (2000-2008) for coastal regions in Queensland. Grid data on future climate projections for 2025, 2050 and 2100 were also obtained. Logistic regression models were built to forecast the potential risk of BFV disease distribution under existing climatic, socio-economic and tidal conditions. The model was applied to estimate the potential geographic distribution of BFV outbreaks under climate change scenarios. The predictive model had good accuracy, sensitivity and specificity. Maps of the potential future risk of BFV disease indicated that the disease would vary significantly across coastal regions in Queensland by 2100, owing to marked differences in future rainfall and temperature projections. CONCLUSIONS/SIGNIFICANCE The results of this study demonstrate that the future risk of BFV disease would vary across coastal regions in Queensland. These results may be helpful for public health decision-making in developing effective risk management strategies for BFV disease control and prevention programs in Queensland.
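A hedged sketch of the kind of logistic regression model the abstract describes, mapping climate and tidal predictors to an outbreak probability. The coefficients are made-up placeholders, not the study's fitted values:

```python
import math

# Illustrative coefficients only; the real study fitted its own values
# from notified BFV cases and observed conditions (2000-2008).
INTERCEPT = -4.0
B_MAX_TEMP = 0.10   # per degree C of maximum temperature
B_RAINFALL = 0.002  # per mm of rainfall
B_TIDE = 0.5        # indicator for high-tide conditions

def outbreak_probability(max_temp_c, rainfall_mm, high_tide):
    """Logistic model of the probability of a BFV outbreak in a region,
    using the predictor types named in the abstract."""
    z = (INTERCEPT
         + B_MAX_TEMP * max_temp_c
         + B_RAINFALL * rainfall_mm
         + B_TIDE * (1 if high_tide else 0))
    return 1.0 / (1.0 + math.exp(-z))

def classify(prob, threshold=0.5):
    """Map a predicted probability to an outbreak / no-outbreak label;
    sensitivity and specificity are assessed against this cut-off."""
    return prob >= threshold
```

Applying such a model cell-by-cell over gridded climate projections for 2025, 2050 and 2100 is what produces the risk maps the abstract mentions.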
Abstract:
Caveolae and their proteins, the caveolins, transport macromolecules; compartmentalize signalling molecules; and are involved in various repair processes. There is little information regarding their role in the pathogenesis of significant renal syndromes such as acute renal failure (ARF). In this study, an in vivo rat model of 30 min bilateral renal ischaemia followed by reperfusion times from 4 h to 1 week was used to map the temporal and spatial association between caveolin-1 and tubular epithelial damage (desquamation, apoptosis, necrosis). An in vitro model of ischaemic ARF was also studied, where cultured renal tubular epithelial cells or arterial endothelial cells were subjected to injury initiators modelled on ischaemia-reperfusion (hypoxia, serum deprivation, free radical damage or hypoxia-hyperoxia). Expression of caveolin proteins was investigated using immunohistochemistry, immunoelectron microscopy, and immunoblots of whole cell, membrane or cytosol protein extracts. In vivo, healthy kidney had abundant caveolin-1 in vascular endothelial cells and also some expression in membrane surfaces of distal tubular epithelium. In the kidneys of ARF animals, punctate cytoplasmic localization of caveolin-1 was identified, with high intensity expression in injured proximal tubules that were losing basement membrane adhesion or were apoptotic, 24 h to 4 days after ischaemia-reperfusion. Western immunoblots indicated a marked increase in caveolin-1 expression in the cortex where some proximal tubular injury was located. In vitro, the main treatment-induced change in both cell types was translocation of caveolin-1 from the original plasma membrane site into membrane-associated sites in the cytoplasm. Overall, expression levels did not alter for whole cell extracts and the protein remained membrane-bound, as indicated by cell fractionation analyses. Caveolin-1 was also found to localize intensely within apoptotic cells. 
The results are indicative of a role for caveolin-1 in ARF-induced renal injury. Whether it functions for cell repair or death remains to be elucidated.
Abstract:
Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed, to some degree, to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as many interactions between agents, which can learn and have a goal, are required. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can be problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and consumers' behaviours interact with one another. A framework that aims to answer a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, in which both aspects are often conflated. It has many advantages in terms of the reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics remain the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required, depending on which simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels), or to describe the assets and their relation to one another (e.g. the network assets). Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets.
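The asset/agent separation described above can be sketched as two classes: one holding only physical characteristics, the other only behaviour. The class names, fields, and the battery scenario are illustrative assumptions, not MODAM's actual API:

```python
class BatteryAsset:
    """Physical characteristics only. The same asset description can be
    reused by agents pursuing different goals (the asset/agent split)."""
    def __init__(self, capacity_kwh, depth_of_discharge):
        self.capacity_kwh = capacity_kwh
        self.depth_of_discharge = depth_of_discharge
        self.stored_kwh = 0.0

    def usable_kwh(self):
        # Only a fraction of the stored energy can safely be drawn.
        return self.stored_kwh * self.depth_of_discharge

class PeakShavingAgent:
    """Behaviour only: charges off-peak, discharges at peak demand.
    A different agent (e.g. backup power) could drive the same asset."""
    CHARGE_RATE_KWH = 2.0  # illustrative per-step charge rate

    def __init__(self, asset):
        self.asset = asset

    def step(self, is_peak, demand_kwh):
        if is_peak:
            supplied = min(demand_kwh, self.asset.usable_kwh())
            self.asset.stored_kwh -= supplied
            return supplied
        self.asset.stored_kwh = min(self.asset.capacity_kwh,
                                    self.asset.stored_kwh + self.CHARGE_RATE_KWH)
        return 0.0
```

Swapping `PeakShavingAgent` for another behaviour class, without touching `BatteryAsset`, is the reusability and composability benefit the abstract describes.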
Simulations have been run to understand the potential impact of changes to the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is part of future work with the addition of electric vehicles.
Abstract:
Social networking sites (SNSs), with their large numbers of users and large information base, seem to be perfect breeding grounds for exploiting the vulnerabilities of people, the weakest link in security. Deceiving, persuading, or influencing people to provide information or to perform an action that will benefit the attacker is known as “social engineering.” While technology-based security has been addressed by research and may be well understood, social engineering is more challenging to understand and manage, especially in new environments such as SNSs, owing to some factors of SNSs that reduce the ability of users to detect the attack and increase the ability of attackers to launch it. This work will contribute to the knowledge of social engineering by presenting the first two conceptual models of social engineering attacks in SNSs. Phase-based and source-based models are presented, along with an intensive and comprehensive overview of different aspects of social engineering threats in SNSs.
Abstract:
We describe recent biologically-inspired mapping research incorporating brain-based multi-sensor fusion and calibration processes and a new multi-scale, homogeneous mapping framework. We also review the interdisciplinary approach to the development of the RatSLAM robot mapping and navigation system over the past decade and discuss the insights gained from combining pragmatic modelling of biological processes with attempts to close the loop back to biology. Our aim is to encourage the pursuit of truly interdisciplinary approaches to robotics research by providing successful case studies.
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that puts a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement, while still satisfying the computation deadline.
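Since the abstract frames placement as a bin-packing generalization without specifying its own heuristic, the sketch below uses the classic first-fit-decreasing heuristic as an illustrative baseline for assigning task resource demands to machines:

```python
def first_fit_decreasing(task_demands, machine_capacity):
    """Place mapper/reducer resource demands onto machines using
    first-fit-decreasing, a classic bin-packing heuristic (a stand-in
    for the paper's own algorithm, which the abstract does not detail).

    task_demands: list of per-task resource demands (e.g. memory units)
    machine_capacity: capacity of each identical machine
    Returns a list of machines, each a list of the demands placed on it."""
    machines = []
    for demand in sorted(task_demands, reverse=True):
        # Try to fit the task on the first machine with spare capacity.
        for machine in machines:
            if sum(machine) + demand <= machine_capacity:
                machine.append(demand)
                break
        else:
            # No machine fits: provision a new one (raising cost).
            machines.append([demand])
    return machines
```

Sorting demands in decreasing order tends to use fewer machines than naive first-fit, and the number of machines opened is the cost a cloud placement algorithm tries to minimise while meeting the deadline.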
Abstract:
Genomic instability underlies the transformation of host cells toward malignancy, promotes development of invasion and metastasis and shapes the response of established cancer to treatment. In this review, we discuss recent advances in our understanding of genomic stability in squamous cell carcinoma of the head and neck (HNSCC), with an emphasis on DNA repair pathways. HNSCC is characterized by distinct profiles in genome stability between similarly staged cancers that are reflected in risk, treatment response and outcomes. Defective DNA repair generates chromosomal derangement that can cause subsequent alterations in gene expression, and is a hallmark of progression toward carcinoma. Variable functionality of an increasing spectrum of repair gene polymorphisms is associated with increased cancer risk, while aetiological factors such as human papillomavirus, tobacco and alcohol induce significantly different behaviour in induced malignancy, underpinned by differences in genomic stability. Targeted inhibition of signalling receptors has proven to be a clinically-validated therapy, and protein expression of other DNA repair and signalling molecules associated with cancer behaviour could potentially provide a more refined clinical model for prognosis and treatment prediction. Development and expansion of current genomic stability models is furthering our understanding of HNSCC pathophysiology and uncovering new, promising treatment strategies. © 2013 Glenn Jenkins et al.
Abstract:
Many newspapers and magazines have added "social media features" to their web-based information services in order to allow users to participate in the production of content. This study examines the specific impact of a firm's investment in social media features on its online business model. We conduct a comparative case study of four Scandinavian print media firms that have added social media features to their online services. We show how social media features lead to online business model innovation, particularly linked to the firms' value propositions. The paper discusses the repercussions of this transformation on firms' relationships with consumers and with traditional content contributors. The modified value proposition also requires firms to acquire new competences in order to reap the full benefit of their social media investments. We show that the firms have been unable to do so, since they have not allowed the social media features to affect their online revenue models.