129 results for standalone


Relevance:

10.00%

Publisher:

Abstract:

This report seeks to identify the features of good practice in the development, assessment and evaluation of digital literacy for graduate employability.

The report forms Stage 2 of a two-stage literature review. The results of the first stage of the review are reported in: Towards an understanding of ‘Digital Literacy(ies)’. The current report draws on some of the same literature as the Stage 1 report and covers the same time period, up to the end of 2012. In addition, the current report covers literature that provides discussion and accounts of good practice in digital literacy, particularly practice that is embedded in course curricula.

The literature provides numerous examples of standalone digital literacy practices. While such practices may be effective for some purposes, they may be less effective than course-integrated practices in contributing to graduate employability. However, there are few accounts of course-integrated practices in the literature and fewer still that provide a compelling case for their positive contribution to graduate outcomes. Accordingly, the report identifies eight criteria of good practice in digital literacy for the assurance of graduate learning outcomes. It then identifies types of broad teaching and learning practices that appear to best encompass these criteria and which provide meaningful contexts in which to develop the digital literacy competencies of students.

Relevance:

10.00%

Publisher:

Abstract:

Many aspects of our modern society now have either a direct or implicit dependence upon information technology. As such, a compromise of the availability or integrity of these systems (which may encompass such diverse domains as banking, government, health care, and law enforcement) could have dramatic consequences from a societal perspective. These key systems are often referred to as critical infrastructure. Critical infrastructure can consist of corporate information systems or systems that control key industrial processes; the latter are referred to as Industrial Control Systems (ICS). ICS have evolved since the 1960s from standalone systems to networked architectures that communicate across large distances, utilise wireless networks and can be controlled via the Internet. ICS form part of many countries’ key critical infrastructure, including Australia’s. They are used to remotely monitor and control the delivery of essential services and products, such as electricity, gas, water, waste treatment and transport systems. The need for security measures within these systems was not anticipated in the early development stages, as they were designed to be closed systems rather than open systems accessible via the Internet. We are also seeing these ICS and their supporting systems being integrated into organisational corporate systems.

Relevance:

10.00%

Publisher:

Abstract:

The number of research papers available today is growing at a staggering rate, generating a huge amount of information that people cannot keep up with. According to a tendency indicated by the United States’ National Science Foundation, more than 10 million new papers will be published in the next 20 years. Because most of these papers will be available on the Web, this research focuses on recommending research papers to users, in order to lead users directly to papers of their interest. Recommender systems are used to recommend items to users among a huge stream of available items, according to users’ interests. This research focuses on the two most prevalent techniques to date, namely Content-Based Filtering and Collaborative Filtering. The first explores the text of the paper itself, recommending items similar in content to the ones the user has rated in the past. The second explores the citation web existing among papers. As these two techniques have complementary advantages, we explored hybrid approaches to recommending research papers. We created standalone and hybrid versions of the algorithms and evaluated them through both offline experiments on a database of 102,295 papers and an online experiment with 110 users. Our results show that the two techniques can be successfully combined to recommend papers, and that coverage increases to 100% in the hybrid algorithms. In addition, we found that different algorithms are more suitable for recommending different kinds of papers. Finally, we verified that users’ research experience influences the way they perceive recommendations. In parallel, we found no significant differences in recommending papers for users from different countries. However, our results showed that users interacting with a research paper recommender system are much happier when the interface is presented in their native language, regardless of the language in which the papers are written. Therefore, the interface should be tailored to the user’s mother language.
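The abstract does not state how the content-based and collaborative scores were combined; a minimal linear-blend sketch, with hypothetical paper records (term counts and citation lists) standing in for the real data, might look like:

```python
from collections import Counter
import math

def cosine(a, b):
    # Cosine similarity between two bag-of-words term-count vectors.
    num = sum(a[t] * b[t] for t in a if t in b)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def hybrid_score(candidate, liked_papers, alpha=0.5):
    """Blend a content-based score (best text similarity to any paper the
    user liked) with a collaborative score (Jaccard overlap of citations).
    The blend weight `alpha` and the record layout are illustrative."""
    text_sim = max(cosine(candidate["terms"], p["terms"]) for p in liked_papers)
    cites = set(candidate["citations"])
    cite_sim = max(len(cites & set(p["citations"])) / (len(cites | set(p["citations"])) or 1)
                   for p in liked_papers)
    return alpha * text_sim + (1 - alpha) * cite_sim
```

A candidate identical in text and citations to a liked paper scores 1.0; a partial overlap in either dimension lands strictly between 0 and 1.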

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the analysis of some usual MPPT (Maximum Power Point Tracking) strategies intended for small wind energy conversion (up to 1 kW) based on permanent magnet synchronous generators (PMSG), considering the stand-alone application of a novel buck-boost integrated inverter. Each MPPT method is analytically introduced and then simulated using MATLAB/Simulink, considering standard wind conditions as well as commercially available turbines and generators. The power extracted in each case is compared with the maximum available power, and the tracking factor is calculated for each method. The focus is thus on improving the efficiency of stand-alone wind energy conversion systems (WECS) with battery chargers and an AC load supplied by an inverter. For this purpose, a novel single-phase buck-boost integrated inverter is introduced. Finally, the main experimental results for the introduced inverter are presented. © 2011 IEEE.
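As a rough illustration of the tracking factor described above (the paper's turbine and generator models are of course far more detailed), one can compare the power an MPPT method extracts against the theoretical maximum available in the wind; the air density and peak power coefficient below are illustrative assumptions, not values from the paper:

```python
def available_power(v, rho=1.225, area=1.0, cp_max=0.48):
    # Maximum power (W) extractable from wind of speed v (m/s) by a
    # turbine of swept area `area` (m^2); cp_max is the assumed peak
    # power coefficient, bounded above by the Betz limit (~0.593).
    return 0.5 * rho * area * cp_max * v ** 3

def tracking_factor(extracted, available):
    # Ratio of energy actually extracted by an MPPT method to the
    # maximum available energy over the same sequence of samples.
    return sum(extracted) / sum(available)
```

A tracking factor of 1.0 would mean the method held the turbine at the maximum power point throughout; practical methods fall somewhat below that.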

Relevance:

10.00%

Publisher:

Abstract:

Graduate Program in Information Science (Pós-graduação em Ciência da Informação) - FFC

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

10.00%

Publisher:

Abstract:

This master's thesis presents the design and construction of a ground mobile robot named LOGBOT, with differential drive: two driven wheels and one free wheel to keep its structure stable relative to the surface. The robot offers telemetry and autonomous control modes. In telemetry (ROV) mode, the robot communicates with the control station by radio frequency at distances of up to one kilometre outdoors and up to one hundred metres indoors. In autonomous (AGV) mode, the robot is able to navigate unknown indoor environments, always using the wall on its left as the reference for its trajectory. The sequence of movements executed along the trajectory is sent to the control station, which analyses the robot's performance. To perform its tasks in autonomous mode, the robot's software includes a reactive intelligent agent that detects features of the environment (obstacles, ends of walls, etc.) and decides which action the robot should take in order to avoid obstacles and control the speed of its wheels. Odometric errors and their correction based on external sensory information are duly addressed. Hierarchical control of the robot as a whole and closed-loop control of the robot's wheel speeds are used. The results showed that the LOGBOT mobile robot is able to navigate, with stability and precision, indoor corridor-like environments (wall following).
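The abstract describes a reactive agent that keeps the left wall as a trajectory reference; a toy decision rule in that spirit (distance thresholds and action names are hypothetical, not from the thesis) could be sketched as:

```python
def wall_follow_action(left, front, target=0.5, tol=0.1):
    """Reactive rule for following the wall on the robot's left.
    `left` and `front` are range readings in metres; `target` is the
    desired distance to the wall. All values are illustrative."""
    if front < target:           # wall or obstacle ahead: turn away from it
        return "turn_right"
    if left > target + tol:      # drifting away from the left wall
        return "turn_left"
    if left < target - tol:      # too close to the left wall
        return "turn_right"
    return "forward"             # within the tolerance band: keep going
```

Called once per sensor cycle, a rule like this produces the obstacle-avoiding, wall-hugging behaviour the abstract attributes to the reactive agent.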

Relevance:

10.00%

Publisher:

Abstract:

Graduate Program in Electrical Engineering (Pós-graduação em Engenharia Elétrica) - FEIS

Relevance:

10.00%

Publisher:

Abstract:

Background: Proteinaceous toxins are observed across all levels of inter-organismal and intra-genomic conflicts. These include recently discovered prokaryotic polymorphic toxin systems implicated in intra-specific conflicts. They are characterized by a remarkable diversity of C-terminal toxin domains generated by recombination with standalone toxin-coding cassettes. Prior analysis revealed a striking diversity of nuclease and deaminase domains among the toxin modules. We systematically investigated polymorphic toxin systems using comparative genomics, sequence and structure analysis. Results: Polymorphic toxin systems are distributed across all major bacterial lineages and are delivered by at least eight distinct secretory systems. In addition to type-II, these include type-V, VI, VII (ESX), and the poorly characterized "Photorhabdus virulence cassettes (PVC)", PrsW-dependent and MuF phage-capsid-like systems. We present evidence that trafficking of these toxins is often accompanied by autoproteolytic processing catalyzed by HINT, ZU5, PrsW, caspase-like, papain-like, and a novel metallopeptidase associated with the PVC system. We identified over 150 distinct toxin domains in these systems. These span an extraordinary catalytic spectrum, including 23 distinct clades of peptidases, numerous previously unrecognized versions of nucleases and deaminases, ADP-ribosyltransferases, ADP-ribosyl cyclases, RelA/SpoT-like nucleotidyltransferases, glycosyltransferases and other enzymes predicted to modify lipids and carbohydrates, and a pore-forming toxin domain. Several of these toxin domains are shared with host-directed effectors of pathogenic bacteria. Over 90 families of immunity proteins might neutralize anywhere from a single type to at least 27 distinct types of toxin domains. In some organisms, multiple tandem immunity genes or immunity protein domains are organized into polyimmunity loci or polyimmunity proteins.
Gene-neighborhood analysis of polymorphic toxin systems predicts the presence of novel trafficking-related components, as well as the organizational logic that allows toxin diversification through recombination. Domain architecture and protein-length analysis revealed that these toxins might be deployed as secreted factors, through directed injection, or via inter-cellular contact facilitated by filamentous structures formed by RHS/YD, filamentous hemagglutinin and other repeats. Phyletic pattern and life-style analysis indicate that polymorphic toxins and polyimmunity loci participate in cooperative behavior and facultative 'cheating' in several ecosystems such as the human oral cavity and soil. Multiple domains from these systems have also been repeatedly transferred to eukaryotes and their viruses, such as the nucleo-cytoplasmic large DNA viruses. Conclusions: Along with a comprehensive inventory of toxins and immunity proteins, we present several testable predictions regarding the active sites and catalytic mechanisms of toxins, their processing and trafficking, and their role in intra-specific and inter-specific interactions between bacteria. These systems provide insights into the emergence of key systems at different points in eukaryotic evolution, such as ADP ribosylation, the interaction of myosin VI with cargo proteins, the mediation of apoptosis, hyphal heteroincompatibility, hedgehog signaling, arthropod toxins, cell-cell interaction molecules like teneurins, and different signaling messengers.

Relevance:

10.00%

Publisher:

Abstract:

This study aims to compare and validate two soil-vegetation-atmosphere-transfer (SVAT) schemes: TERRA-ML and the Community Land Model (CLM). Both SVAT schemes are run in standalone mode (decoupled from an atmospheric model) and forced with meteorological in-situ measurements obtained at several tropical African sites. Model performance is quantified by comparing simulated sensible and latent heat fluxes with eddy-covariance measurements. Our analysis indicates that the Community Land Model corresponds more closely to the micrometeorological observations, reflecting the advantages of the higher model complexity and physical realism. Deficiencies in TERRA-ML are addressed and its performance is improved: (1) adjusting input data (root depth) to region-specific values (tropical evergreen forest) resolves dry-season underestimation of evapotranspiration; (2) adjusting the leaf area index and albedo (depending on hard-coded model constants) resolves overestimations of both latent and sensible heat fluxes; and (3) an unrealistic flux partitioning caused by overestimated superficial water contents is reduced by adjusting the hydraulic conductivity parameterization. CLM is by default more versatile in its global application on different vegetation types and climates. On the other hand, with its lower degree of complexity, TERRA-ML is much less computationally demanding, which leads to faster calculation times in a coupled climate simulation.
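Quantifying how closely simulated heat fluxes correspond to eddy-covariance measurements, as described above, typically reduces to simple agreement statistics; a minimal sketch of bias and RMSE over a series of flux samples (the study's actual evaluation procedure is not specified in the abstract):

```python
import math

def flux_stats(simulated, observed):
    # Mean bias and root-mean-square error of simulated heat fluxes
    # (W/m^2) against eddy-covariance observations, paired sample by
    # sample over the same time steps.
    n = len(simulated)
    bias = sum(s - o for s, o in zip(simulated, observed)) / n
    rmse = math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)
    return bias, rmse
```

A model can have near-zero bias yet a large RMSE (compensating over- and under-estimates), which is why both statistics are usually reported together.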

Relevance:

10.00%

Publisher:

Abstract:

Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources to run algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as of physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three years of Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides an abstraction with reliability over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multi-stage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations.
Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier GridBlast project, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager provides novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage costs required to keep such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also orally presented my work at numerous international conferences related to Grid computing and bioinformatics.
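The abstract mentions a programmed cost formula balancing response times against storage costs; a toy version of such a replica-count optimization (the formula below is a hypothetical stand-in, not GridDBManager's actual one) might be:

```python
def best_replica_count(demand, storage_cost, latency_weight, max_replicas=20):
    """Pick the replica count n minimizing a simple cost formula of the
    kind the abstract describes: storage cost grows linearly with n,
    while the expected response time falls roughly as demand / n
    (requests spread across replicas). All weights are illustrative."""
    def cost(n):
        return storage_cost * n + latency_weight * demand / n
    return min(range(1, max_replicas + 1), key=cost)
```

With this shape of formula, heavily used databases earn more replicas and rarely queried ones collapse toward a single copy, which is the balancing behaviour the abstract attributes to the adaptive algorithm.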

Relevance:

10.00%

Publisher:

Abstract:

The term Ambient Intelligence (AmI) refers to a vision of the future of the information society where smart electronic environments are sensitive and responsive to the presence of people and their activities (context awareness). In an ambient intelligence world, devices work in concert to support people in carrying out their everyday activities, tasks and rituals in an easy, natural way, using information and intelligence hidden in the network connecting these devices. This promotes the creation of pervasive environments, improving the quality of life of the occupants and enhancing the human experience. AmI stems from the convergence of three key technologies: ubiquitous computing, ubiquitous communication and natural interfaces. Ambient intelligent systems are heterogeneous and require excellent cooperation between several hardware/software technologies and disciplines, including signal processing, networking and protocols, embedded systems, information management, and distributed algorithms. Since a large number of fixed and mobile embedded sensors is deployed into the environment, the Wireless Sensor Network (WSN) is one of the most relevant enabling technologies for AmI. WSNs are complex systems made up of a number of sensor nodes which can be deployed in a target area to sense physical phenomena and communicate with other nodes and base stations. These simple devices typically embed a low-power computational unit (microcontrollers, FPGAs, etc.), a wireless communication unit, one or more sensors and some form of energy supply (either batteries or energy scavenger modules). WSNs promise to revolutionize the interactions between the real physical world and human beings. Low cost, low computational power, low energy consumption and small size are characteristics that must be taken into consideration when designing and dealing with WSNs. To fully exploit the potential of distributed sensing approaches, a set of challenges must be addressed.
Sensor nodes are inherently resource-constrained systems with very low power consumption and small size requirements, which enables them to reduce the interference on the physical phenomena being sensed and allows easy, low-cost deployment. They have limited processing speed, storage capacity and communication bandwidth, which must be used efficiently to increase the degree of local "understanding" of the observed phenomena. A particular case of sensor nodes are video sensors. This topic holds strong interest for a wide range of contexts such as military, security, robotics and, most recently, consumer applications. Vision sensors are extremely effective for medium- to long-range sensing because vision provides rich information to human operators. However, image sensors generate a huge amount of data, which must be heavily processed before transmission due to the scarce bandwidth capability of radio interfaces. In particular, in video surveillance, it has been shown that source-side compression is mandatory due to limited bandwidth and delay constraints. Moreover, there is ample opportunity for performing higher-level processing functions, such as object recognition, which have the potential to drastically reduce the required bandwidth (e.g. by transmitting compressed images only when something 'interesting' is detected). The energy cost of image processing must, however, be carefully minimized. Imaging plays an important role in sensing devices for ambient intelligence. Computer vision can, for instance, be used for recognising persons and objects and for recognising behaviour such as illness and rioting. Having a wireless camera as a camera mote opens the way for distributed scene analysis. Multiple eyes see more than one, and a camera system that can observe a scene from multiple directions would be able to overcome occlusion problems and could describe objects in their true 3D appearance. In real time, these approaches constitute a recently opened field of research.
In this thesis we pay attention to the realities of hardware/software technologies and the design needed to realize systems for distributed monitoring, attempting to propose solutions to open issues and to fill the gap between AmI scenarios and hardware reality. The physical implementation of an individual wireless node is constrained by three important metrics, outlined below. Although the design of the sensor network and its sensor nodes is strictly application dependent, a number of constraints should almost always be considered. Among them: • Small form factor, to reduce node intrusiveness. • Low power consumption, to reduce battery size and extend node lifetime. • Low cost, for widespread diffusion. These limitations typically result in the adoption of low-power, low-cost devices such as low-power microcontrollers with a few kilobytes of RAM and tens of kilobytes of program memory, with which only simple data processing algorithms can be implemented. However, the overall computational power of the WSN can be very large, since the network presents a high degree of parallelism that can be exploited through the adoption of ad-hoc techniques. Furthermore, through the fusion of information from the dense mesh of sensors, even complex phenomena can be monitored. In this dissertation we present our results in building several AmI applications suitable for a WSN implementation. The work can be divided into two main areas: Low-Power Video Sensor Nodes and Video Processing Algorithms, and Multimodal Surveillance. Low-Power Video Sensor Nodes and Video Processing Algorithms: In comparison to scalar sensors, such as temperature, pressure, humidity, velocity and acceleration sensors, vision sensors generate much higher-bandwidth data due to the two-dimensional nature of their pixel array. We have tackled all the constraints listed above and have proposed solutions to overcome the current WSN limits for video sensor nodes.
We have designed and developed wireless video sensor nodes focusing on small size and flexibility of reuse in different applications. The video nodes target a different design point: portability (on-board power supply, wireless communication) and a scanty power budget (500 mW), while still providing a prominent level of intelligence, namely a sophisticated classification algorithm and a high level of reconfigurability. We developed two different video sensor nodes: the device architecture of the first is based on a low-cost, low-power FPGA + microcontroller system-on-chip; the second is based on an ARM9 processor. Both systems, designed within the above-mentioned power envelope, can operate in a continuous fashion with a Li-polymer battery pack and a solar panel. Novel low-power, low-cost video sensor nodes which, in contrast to sensors that just watch the world, are capable of comprehending the perceived information in order to interpret it locally, are presented. Featuring such intelligence, these nodes are able to cope with tasks such as the recognition of unattended bags in airports or of persons carrying potentially dangerous objects, which normally require a human operator. Vision algorithms for object detection and acquisition, such as human detection with Support Vector Machine (SVM) classification and abandoned/removed object detection, are implemented, described and illustrated on real-world data. Multimodal surveillance: In several setups the use of wired video cameras may not be possible. For this reason, building an energy-efficient wireless vision network for monitoring and surveillance is one of the major efforts in the sensor network community.
Pyroelectric Infra-Red (PIR) sensors have been used to extend the lifetime of a solar-powered video sensor node by providing an energy-level-dependent trigger to the video camera and the wireless module. Such an approach has been shown to extend node lifetime and can possibly result in continuous operation of the node. Being low-cost, passive (and thus low-power) and of limited form factor, PIR sensors are well suited for WSN applications. Moreover, aggressive power management policies are essential for achieving the long-term operation of standalone distributed cameras. We have used an adaptive controller, Model Predictive Control (MPC), to help the system improve its performance, outperforming naive power management policies.
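A full MPC formulation is beyond the scope of an abstract, but the flavour of an energy-aware policy for a solar-powered camera node can be sketched with a crude energy-neutral duty-cycle rule (a much simpler stand-in for the MPC controller mentioned above; all figures are illustrative):

```python
def next_duty_cycle(battery, harvest_w, active_w, horizon_h=24.0):
    """Pick the fraction of time the camera may stay active over the
    next planning horizon so that expected consumption does not exceed
    the expected harvested energy plus a share of the battery reserve.
    battery: stored energy (Wh); harvest_w: expected solar input (W);
    active_w: power drawn while the camera and radio are awake (W)."""
    budget = harvest_w * horizon_h + battery / 2  # Wh we allow ourselves to spend
    duty = budget / (active_w * horizon_h)        # fraction of the horizon active
    return max(0.0, min(1.0, duty))               # clamp to a valid duty cycle
```

An MPC controller would re-solve a richer version of this budget problem at every step over a receding horizon, with forecasts of harvest and load; the rule above only captures the basic trade-off.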

Relevance:

10.00%

Publisher:

Abstract:

This text explores the process of bone remodeling. The idea supported here is that the signal which cells receive, and which prompts them to change their architectural conformation, is the potential difference on the free boundary surfaces of collagen fibers. These fibers represent bone at the nanoscale. The subject of this work is a multiscale model. Many studies have attempted to discover the relationship between a macroscopic external bone load and the cellular scale. The first three simulations applied a longitudinal force, a flexion force and a transversal compression force to a fully longitudinal fiber 0-0 sample. The results showed, first, the great difference between a fully longitudinal stress and a flexion stress. Secondly, a decrease in the potential difference was observed in the transversal force configuration, suggesting that such a signal could be taken as the one that drives bone remodeling. To exclude the possibility that the obtained results were attributable to a piezoelectric effect of the collagen rather than to the mechanical load, different coupling analyses were developed. These analyses show that the piezoelectric effect is far less important than the one for which the mechanical load is responsible. At this point the work had to explore how bone remodeling could develop. The analyses involved different geometries and fiber percentages. Moreover, the model initially had to be run manually; after an initial improvement, the author implemented a standalone version through the integration of COMSOL Multiphysics, MATLAB and Excel.

Relevance:

10.00%

Publisher:

Abstract:

The thesis describes the implementation of calibration, format-translation and data-conditioning software for radiometric tracking data of deep-space spacecraft. All of the propagation-media noise rejection techniques available as features in the code are covered in their mathematical formulations, performance and software implementations. Some techniques are retrieved from the literature and the current state of the art, while other algorithms have been conceived ex novo. All three typical deep-space refractive environments (solar plasma, ionosphere, troposphere) are dealt with by employing specific subroutines. Specific attention has been reserved for the GNSS-based tropospheric path delay calibration subroutine, since it is the bulkiest module of the software suite, in terms of both the sheer number of lines of code and the development time. The software is currently in its final stage of development and, once completed, will serve as a pre-processing stage for orbit determination codes. Calibration of transmission-media noise sources in radiometric observables has proved to be an essential operation to perform on radiometric data in order to meet the increasingly demanding error budget requirements of modern deep-space missions. A completely autonomous and all-around propagation-media calibration software package is a novelty in orbit determination, although standalone codes are currently employed by ESA and NASA. The described software is planned to be compatible with the current standards for tropospheric noise calibration used by both agencies, such as the AMC, TSAC and ESA IFMS weather data, and it natively works with the Tracking Data Message (TDM) file format adopted by CCSDS as a standard aimed at promoting and simplifying inter-agency collaboration.
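One standard ingredient of GNSS-based tropospheric path delay calibration is the Saastamoinen model, which gives the zenith hydrostatic delay directly from surface pressure; the thesis' subroutine is certainly more elaborate, but a minimal sketch of that single ingredient is:

```python
import math

def saastamoinen_zhd(pressure_hpa, lat_rad, height_m):
    """Saastamoinen zenith hydrostatic delay (metres) from surface
    pressure (hPa), site latitude (radians) and site height (metres).
    The denominator corrects gravity for latitude and elevation."""
    return 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.28e-6 * height_m)
```

At standard sea-level pressure the hydrostatic component alone is roughly 2.3 m of path delay, which is why tropospheric calibration matters for the error budgets mentioned above; the wet component, driven by water vapour, is the harder part to model and is where GNSS-based estimation earns its keep.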