872 results for Online and off-line diagnosis and monitoring methods
Abstract:
The Telehealth Brazil Networks Program, created in 2007 to strengthen primary care and the unified health system (SUS - Sistema Único de Saúde), uses information and communication technologies for distance learning activities related to health. The technology enables interaction between health professionals and/or their patients, furthering the capabilities of Family Health Teams (FHT). The program is grounded in legislation that defines the technologies, protocols and processes guiding the work of the Telehealth nuclei in providing services to the population. Among these services is teleconsulting: a registered consultation held between workers, professionals and managers of healthcare through bidirectional telecommunication instruments, in order to answer questions about clinical procedures, health actions and questions about the work process. With the expansion of the program in 2011, it became possible to detect problems and challenges that affect virtually all nuclei, at a different scale in each region. Among these problems are the heterogeneity of platforms, especially for teleconsulting, and low internet coverage in the municipalities, mainly in the interior cities of Brazil. From this perspective, the aim of this paper is to propose a distributed architecture that uses mobile computing to enable the sending of teleconsultations. The architecture works offline: when an internet connection is available, the data are synchronized with the server. The data travel in compressed form to reduce the need for high transmission rates. Any Telehealth nucleus can use this architecture through an external service, coupled via a communication interface.
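The offline store-and-forward behaviour described above can be sketched as a local queue that compresses each teleconsultation before storage and drains it to the server once connectivity returns. This is a minimal illustration, not the paper's actual design; the class and field names are invented assumptions:

```python
import json
import sqlite3
import zlib

class OfflineTeleconsultQueue:
    """Hypothetical local outbox for teleconsultations (illustrative only)."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload BLOB)"
        )

    def enqueue(self, consult: dict) -> int:
        # Compress before storing, so the payload suits low-bandwidth links.
        blob = zlib.compress(json.dumps(consult).encode("utf-8"))
        self.db.execute("INSERT INTO outbox (payload) VALUES (?)", (blob,))
        self.db.commit()
        return len(blob)

    def sync(self, send) -> int:
        # Called when connectivity returns; `send` posts one decompressed
        # record to the server and returns True on success.
        sent = 0
        for row_id, blob in self.db.execute("SELECT id, payload FROM outbox").fetchall():
            record = json.loads(zlib.decompress(blob).decode("utf-8"))
            if send(record):
                self.db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
                sent += 1
        self.db.commit()
        return sent
```

Successfully delivered records are removed from the outbox, so an interrupted sync simply resumes on the next attempt.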
Abstract:
In this work, we introduce the periodic nonlinear Fourier transform (PNFT) method as an alternative and efficacious tool for compensating nonlinear transmission effects in optical fiber links. In Part I, we introduce the algorithmic platform of the technique, describing in detail the direct and inverse PNFT operations, also known as the inverse scattering transform for the periodic (in the time variable) nonlinear Schrödinger equation (NLSE). We pay special attention to explaining the potential advantages of PNFT-based processing over the previously studied nonlinear Fourier transform (NFT) based methods. Further, we elucidate the issue of numerical PNFT computation: we compare the performance of four known numerical methods applicable to the calculation of nonlinear spectral data (the direct PNFT), in particular taking the main spectrum (utilized further in Part II for modulation and transmission) associated with some simple example waveforms as the quality indicator for each method. We show that the Ablowitz-Ladik discretization approach for the direct PNFT provides the best performance in terms of accuracy and computational time.
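As a rough illustration of the direct PNFT step discussed above, the sketch below accumulates Ablowitz-Ladik one-step transfer matrices over one period of the signal and returns the half-trace of the resulting monodromy matrix, i.e. the Floquet discriminant whose crossings of ±1 locate the main spectrum. Sign and normalization conventions for the focusing NLSE vary across the literature; this is one common choice, not the paper's exact formulation:

```python
import numpy as np

def floquet_discriminant(q, dt, lam):
    """Half-trace of the monodromy matrix via Ablowitz-Ladik discretization.

    q   : samples of the periodic potential over one period
    dt  : sample spacing
    lam : real spectral parameter lambda
    """
    M = np.eye(2, dtype=complex)
    z = np.exp(-1j * lam * dt)
    for qn in q:
        Q = qn * dt
        # One-step AL transfer matrix, normalized to unit determinant.
        T = np.array([[z, Q], [-np.conj(Q), 1 / z]], dtype=complex)
        T /= np.sqrt(1 + abs(Q) ** 2)
        M = T @ M
    # For real lambda the trace is real up to round-off.
    return 0.5 * np.trace(M).real
```

For a constant (plane-wave) potential of amplitude A over period T, the discriminant is analytically cos(T·sqrt(lambda² + A²)), which makes a convenient accuracy check of the discretization.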
Abstract:
Marine- and terrestrial-derived biomarkers (alkenones, brassicasterol, dinosterol, and long-chain n-alkanes), as well as carbonate, biogenic opal, and ice-rafted debris (IRD), were measured in two sediment cores in the Sea of Okhotsk, which is located in the northwestern Pacific rim and characterized by high primary productivity. Down-core profiles of phytoplankton markers suggest that primary productivity abruptly increased during the global Meltwater Pulse events 1A (about 14 ka) and 1B (about 11 ka) and stayed high in the Holocene. Spatial and temporal distributions of the phytoplankton productivity were found to be consistent with changes in the reconstructed sea ice distribution on the basis of the IRD. This demonstrates that the progress and retreat of sea ice regulated primary productivity in the Sea of Okhotsk with minimum productivity during the glacial period. The mass accumulation rates of alkenones, CaCO3, and biogenic opal indicate that the dominant phytoplankton species during deglaciation was the coccolithophorid, Emiliania huxleyi, which was replaced by diatoms in the late Holocene. Such a phytoplankton succession was probably caused by an increase in silicate supply to the euphotic layer, possibly associated with a change in surface hydrography and/or linked to enhanced upwelling of North Pacific Deep Water.
Abstract:
The BlackEnergy malware targeting critical infrastructures has a long history. It evolved over time from a simple DDoS platform into a quite sophisticated plug-in-based malware. The plug-in architecture has a persistent malware core with easily installable attack-specific modules for DDoS, spamming, info-stealing, remote access, boot-sector formatting, etc. BlackEnergy has been involved in several high-profile cyber-physical attacks, including the recent Ukraine power grid attack in December 2015. This paper investigates the evolution of BlackEnergy and its cyber attack capabilities. It presents a basic cyber attack model used by BlackEnergy for targeting industrial control systems. In particular, the paper analyzes the cyber threats BlackEnergy poses to synchrophasor-based systems, which are used for real-time control and monitoring functionalities in the smart grid. Several BlackEnergy-based attack scenarios have been investigated by exploiting the vulnerabilities in two widely used synchrophasor communication standards: (i) IEEE C37.118 and (ii) IEC 61850-90-5. Specifically, the paper addresses reconnaissance, DDoS, man-in-the-middle and replay/reflection attacks on IEEE C37.118 and IEC 61850-90-5. Further, the paper also investigates protection strategies for the detection and prevention of BlackEnergy-based cyber-physical attacks.
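Several of the attacks mentioned (injection, replay, malformed frames) hinge on the fact that IEEE C37.118 frames carry only a CRC, not cryptographic authentication. A minimal sketch of validating the common frame prefix is shown below; the field offsets follow the C37.118-2011 layout, but the code is an illustration rather than a complete parser:

```python
import struct

def crc_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    # CRC-CCITT (polynomial 0x1021, initial value 0xFFFF), as used by C37.118.
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

FRAME_TYPES = {0: "data", 1: "header", 2: "cfg-1", 3: "cfg-2", 4: "command"}

def parse_frame(frame: bytes) -> dict:
    # Validate and decode the common 14-byte C37.118 prefix plus trailing CRC.
    if len(frame) < 16 or frame[0] != 0xAA:
        raise ValueError("not a C37.118 frame")
    size, idcode, soc, fracsec = struct.unpack(">HHII", frame[2:14])
    if size != len(frame):
        raise ValueError("FRAMESIZE mismatch (possible truncation/injection)")
    if crc_ccitt(frame[:-2]) != struct.unpack(">H", frame[-2:])[0]:
        raise ValueError("bad CRC")
    return {"type": FRAME_TYPES.get((frame[1] >> 4) & 0x7, "reserved"),
            "idcode": idcode, "soc": soc, "fracsec": fracsec}
```

Note that a valid CRC only rules out corruption: a man-in-the-middle or replay attacker can recompute it, which is precisely why additional protection strategies are needed.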
Abstract:
The Mobile Information Literacy curriculum is a growing collection of training materials designed to build literacies for the millions of people worldwide coming online every month via a mobile phone. Most information and digital literacy curricula were designed for a PC age, and public and private organizations around the world have used these curricula to help newcomers use computers and the internet effectively and safely. The better curricula address not only skills, but also concepts and attitudes. The central question for this project is: what are the relevant skills, concepts, and attitudes for people using mobiles, not PCs, to access the internet? As part of the Information Strategies for Societies in Transition project, we developed a six-module curriculum for mobile-first users. The project is situated in Myanmar, a country undergoing massive political, economic, and social changes, where mobile penetration is expected to reach 80% by the end of 2015, up from just 4% in 2014. Combined with the country’s history of media censorship, Myanmar presents unique challenges in addressing the needs of people who must be able to find and evaluate the quality and credibility of information obtained online, understand how to create and share online information effectively, and participate safely and securely.
Abstract:
In Czech schools, two methods of teaching reading are used: the analytic-synthetic (conventional) method and the genetic method (created in the 1990s). They differ in their theoretical foundations and methodology. The aim of this paper is to describe the theoretical approaches mentioned above and present the results of a study that examined the differences in the development of initial reading skills between these methods. A total of 452 first-grade children (ages 6-8) were assessed with a battery of reading tests at the beginning and end of the first grade and at the beginning of the second grade; 350 pupils participated at all three time points. Based on the data analysis, the developmental dynamics of reading skills under both methods and the main differences in several aspects of reading ability (e.g. reading speed, reading technique, error rate) are described. The main focus is on the development of reading comprehension. Results show that pupils instructed with the genetic approach scored significantly better on the reading comprehension tests used, especially in the first grade. Statistically significant differences also occurred between classes independently of method. Therefore, other factors such as the teacher's role and class composition are discussed.
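Between-group comparisons of this kind typically rest on a two-sample test such as Welch's t-test, which does not assume equal variances across classes taught by different methods. The sketch below shows the computation; the score vectors in the accompanying check are invented for illustration and are not the study's data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples
    with possibly unequal variances (Welch-Satterthwaite approximation)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df
```

A |t| value well above the critical value for the computed degrees of freedom (about 2 at the 5% level for moderate df) would indicate a significant difference between the two groups' comprehension scores.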
Study of white spot disease in four native species in Persian Gulf by histopathology and PCR methods
Abstract:
Since 1993, serious disease outbreaks caused by a new virus (WSV) have occurred among cultured penaeid shrimps, first in Asian countries such as China and later in Latin American countries. Between June and July 2002, rapid and high mortality in cultured Penaeus indicus in the Abadan region of southern Iran, with typical signs and symptoms of White Spot Syndrome Virus, was confirmed by histopathology, PCR, TEM and virology studies. The present study was conducted in 2005 to determine the prevalence (rate of infection, ROI) and severity of infection (SOI) of WSD in five species: 150 samples of captured shrimps and 90 samples of cultured ones, comprising Penaeus indicus, P. semisulcatus, P. merguiensis, Parapenaopsis styliferus, and Metapenaeus affinis. Of the 240 samples, 136 showed clinical and macroscopic signs and symptoms, including white spots on the carapace (0.5-2 mm), easy removal of the cuticle, fragility of the hepatopancreas and red coloration of the motile limbs. Histopathological changes, such as specific intranuclear inclusion bodies (Cowdry type A), were observed in all target tissues (gill, epidermis, haemolymph and midgut) but not in the hepatopancreas, among shrimps collected from various farms in the south and those captured from the Persian Gulf, even in individuals without clinical signs. The ROI for each species was estimated using the Natividad & Lightner (1992b) formula, and the SOI was graded using the generalized scheme provided by Lightner (1996) for assigning a numerical qualitative value to the severity grade of infection, based on histopathology and counts of specific inclusion bodies at different stages (as modified by B. Gholamhoseini). Samples with clinical signs showed grades above 2. Most P. semisulcatus and M. affinis samples showed grade 3, whereas most P. styliferus samples showed grade 4, suggesting different sensitivities among species.
All samples were also tested by nested PCR with the IQ™ 2000 WSSV kit; 183 of the 240 samples were positive, and the three levels of infection indicated by this PCR confirmed our SOI grades while being more specific.
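The rate of infection referred to above is, in the Natividad & Lightner (1992b) sense, a simple prevalence proportion. The sketch below computes it for the counts reported in this abstract; the function name is illustrative:

```python
def rate_of_infection(positive: int, examined: int) -> float:
    """Prevalence (rate of infection) as a percentage:
    100 * (number of infected individuals / number examined)."""
    if examined <= 0:
        raise ValueError("examined must be positive")
    return 100.0 * positive / examined

# Figures reported above: 136/240 with gross signs, 183/240 nested-PCR positive.
```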
Abstract:
Lake Kyoga, at the time of the Worthington Survey (Worthington, 1929), was fished only by the natives around it. The fishing gear consisted of locally made basket traps, hooks and seine nets made of papyrus. Fishing took place mainly during the dry season, as in the wet season the fishers would revert to crop growing. From 1937 to the 1950s, Oreochromis variabilis, Oreochromis esculentus (Ngege) and Protopterus aethiopicus (Mamba) were the most important commercial fish species and contributed over 95% of the total landings until the early 1950s, when their proportions started to change as a result of changes in fishing techniques. Prior to the mid-1950s, the tilapiines were caught mainly using basket traps, and P. aethiopicus was caught on hooks.
Abstract:
Participation Space Studies explore eParticipation in the day-to-day activities of local, citizen-led groups, working to improve their communities. The focus is the relationship between activities and contexts. The concept of a participation space is introduced in order to reify online and offline contexts where people participate in democracy. Participation spaces include websites, blogs, email, social media presences, paper media, and physical spaces. They are understood as sociotechnical systems: assemblages of heterogeneous elements, with relevant histories and trajectories of development and use. This approach enables the parallel study of diverse spaces, on and offline. Participation spaces are investigated within three case studies, centred on interviews and participant observation. Each case concerns a community or activist group, in Scotland. The participation spaces are then modelled using a Socio-Technical Interaction Network (STIN) framework (Kling, McKim and King, 2003). The participation space concept effectively supports the parallel investigation of the diverse social and technical contexts of grassroots democracy and the relationship between the case-study groups and the technologies they use to support their work. Participants’ democratic participation is supported by online technologies, especially email, and they create online communities and networks around their goals. The studies illustrate the mutual shaping relationship between technology and democracy. Participants’ choice of technologies can be understood in spatial terms: boundaries, inhabitants, access, ownership, and cost. Participation spaces and infrastructures are used together and shared with other groups. Non-public online spaces, such as Facebook groups, are vital contexts for eParticipation; further, the majority of participants’ work is non-public, on and offline. It is informational, potentially invisible, work that supports public outputs. 
The groups involve people and influence events through emotional and symbolic impact, as well as rational argument. Images are powerful vehicles for this and digital images become an increasingly evident and important feature of participation spaces throughout the consecutively conducted case studies. Collaboration of diverse people via social media indicates that these spaces could be understood as boundary objects (Star and Griesemer, 1989). The Participation Space Studies draw from and contribute to eParticipation, social informatics, mediation, social shaping studies, and ethnographic studies of Internet use.
Abstract:
Recent developments in automation, robotics and artificial intelligence have pushed these technologies into wider use, and driverless transport systems are nowadays already state-of-the-art on certain legs of transportation. This has prompted the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model aims to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data have been collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting and the identification of critical success factors, thanks to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques.
As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it can be argued that activity-based life cycle costing is a suitable method for cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs in this way, it was argued that the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still following the implementation and model-building steps presented in the existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
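The combination of activity-based costing and Monte Carlo simulation described above can be sketched as sampling each activity's annual cost from a distribution and aggregating over the vessel's life cycle. The activities, cost ranges, and choice of triangular distributions below are invented placeholders, not figures or structure from the AAWA model:

```python
import random
import statistics

# Hypothetical cost drivers: (low, mode, high) in k-euro per year.
ACTIVITIES = {
    "remote operations centre": (400, 550, 800),
    "maintenance":              (300, 450, 700),
    "insurance":                (150, 200, 300),
    "port and fairway fees":    (100, 130, 180),
}

def simulate_life_cycle_cost(years=25, runs=10_000, seed=42):
    """Monte Carlo estimate of total life cycle cost: each run draws every
    activity's cost for every year from a triangular distribution and sums."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        total = sum(
            sum(rng.triangular(lo, hi, mode) for _ in range(years))
            for lo, mode, hi in ACTIVITIES.values()
        )
        totals.append(total)
    totals.sort()
    return {"mean": statistics.mean(totals),
            "p05": totals[int(0.05 * runs)],   # optimistic scenario
            "p95": totals[int(0.95 * runs)]}   # pessimistic scenario
```

Because costs are accumulated per activity, the same structure supports tracing which activities dominate the total, i.e. identifying candidate critical success factors.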