877 results for mandatory notification
Abstract:
Link quality estimation is a fundamental building block for the design of several different mechanisms and protocols in wireless sensor networks (WSNs). A thorough experimental evaluation of link quality estimators (LQEs) is thus mandatory. Several WSN experimental testbeds have been designed ([1–4]), but only [3] and [2] targeted link quality measurements. However, these were exploited for analyzing low-power link characteristics rather than the performance of LQEs. Despite its importance, the experimental performance evaluation of LQEs remains an open problem, mainly due to the difficulty of providing a quantitative evaluation of their accuracy. This motivated us to build a benchmarking testbed for LQEs — RadiaLE — which we present here as a demo. It includes (i.) hardware components that represent the WSN under test and (ii.) a software tool for the setup and control of the experiments and for analyzing the collected data, allowing for LQE evaluation.
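The packet reception ratio (PRR) and its smoothed variants are among the simplest LQEs a benchmarking testbed can evaluate. The sketch below is illustrative only and is not RadiaLE code; `wmewma` follows the general window-mean-with-EWMA idea, with `alpha` as an assumed smoothing parameter:

```python
def prr(received, sent):
    """Packet Reception Ratio: fraction of probe packets received.

    One of the simplest link quality estimators; in practice a window
    of recent probes is used so the estimate tracks link dynamics.
    """
    if sent == 0:
        raise ValueError("at least one probe must be sent")
    return received / sent

def wmewma(samples, alpha=0.6):
    """Smooth successive PRR samples with an exponential moving average.

    Higher alpha weights history more heavily, trading agility for
    stability in the estimate.
    """
    estimate = samples[0]
    for s in samples[1:]:
        estimate = alpha * estimate + (1 - alpha) * s
    return estimate
```

A quantitative accuracy evaluation would then compare such estimates against a ground-truth PRR measured over a long observation window.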
Abstract:
Wireless Sensor Networks (WSNs) are highly distributed systems in which resource allocation (bandwidth, memory) must be performed efficiently to provide a minimum acceptable Quality of Service (QoS) to the regions where critical events occur. In fact, if resources are statically assigned, independently of the location and instant of the events, these resources will definitely be misused. In other words, it is more efficient to dynamically grant more resources to sensor nodes affected by critical events, thus providing better network resource management and reducing end-to-end delays of event notification and tracking. In this paper, we discuss the use of a WSN management architecture based on the active network management paradigm to provide real-time tracking and reporting of dynamic events while ensuring efficient resource utilization. The active network management paradigm allows packets to transport not only data but also program scripts that are executed in the nodes to dynamically modify the operation of the network. This presumes the use of a runtime execution environment (middleware) in each node to interpret the scripts. We consider hierarchical (e.g., cluster-tree, two-tiered) WSN topologies, since they have been used to improve the timing performance of WSNs, as they support deterministic medium access control protocols.
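The active-packet idea described above can be sketched minimally: a packet carries a script identifier and parameters alongside its data, and each node's runtime dispatches the script against local state. The `Node` class, the `raise_priority` handler, and all field names below are illustrative assumptions, not the paper's middleware API:

```python
class Node:
    """Minimal model of a sensor node with a script-dispatching runtime."""

    def __init__(self):
        # Local resource state a script is allowed to modify.
        self.state = {"sampling_rate_hz": 1, "bandwidth_share": 0.1}
        # The runtime environment maps script names to handlers.
        self.handlers = {"raise_priority": self._raise_priority}

    def _raise_priority(self, args):
        # Grant more resources to nodes affected by a critical event.
        self.state["sampling_rate_hz"] = args.get(
            "rate_hz", self.state["sampling_rate_hz"])
        self.state["bandwidth_share"] = args.get(
            "share", self.state["bandwidth_share"])

    def receive(self, packet):
        # Execute the embedded script, then return the payload for forwarding.
        script = packet.get("script")
        if script in self.handlers:
            self.handlers[script](packet.get("args", {}))
        return packet["data"]

node = Node()
node.receive({"data": b"temp=37.2", "script": "raise_priority",
              "args": {"rate_hz": 10, "share": 0.5}})
```

In a real deployment the script would be authenticated and interpreted by the node middleware rather than dispatched from a trusted dictionary.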
Abstract:
Structural health monitoring has long been identified as a prominent application of Wireless Sensor Networks (WSNs), as traditional wired solutions present inherent limitations such as installation/maintenance cost, scalability, and visual impact. Nevertheless, there is a lack of ready-to-use, off-the-shelf WSN technologies able to fulfill some of the most demanding requirements of these applications, which can span from critical physical infrastructures (e.g., bridges, tunnels, mines, the energy grid) to historical buildings or even industrial machinery and vehicles. Low-power and low-cost yet extremely sensitive and accurate accelerometer and signal acquisition hardware, and stringent time synchronization of all sensor data, are just examples of the requirements imposed by most of these applications. This paper presents a prototype system for health monitoring of civil engineering structures that has been jointly conceived by a team of civil, electrical, and computer engineers. It merges the benefits of standard, commercial off-the-shelf (COTS) hardware and communication technologies with a minimum set of custom-designed signal acquisition hardware that is mandatory to fulfill all application requirements.
Abstract:
OBJECTIVE To analyze the patterns and legal requirements of methylphenidate consumption. METHODS We conducted a cross-sectional study of the data from prescription notification forms and balance lists of drug sales – psychoactive and others – subject to special control in the fifth largest city of Brazil, in 2006. We determined the defined and prescribed daily doses, the average prescription and dispensation periods, and the regional sales distribution in the municipality. In addition, we estimated the costs of drug acquisition and analyzed the individual drug consumption profile using the Lorenz curve. RESULTS The balance list data covered all notified sales of the drug, while data from prescription notification forms covered 50.6% of the pharmacies that sold it, including those with the highest sales volumes. Total methylphenidate consumption was 0.37 DDD/1,000 inhabitants/day. Sales were concentrated in more developed areas, and regular-release tablets were the most commonly prescribed pharmaceutical formulation. In some regions of the city, approximately 20.0% of the prescriptions and dispensations exceeded 30 mg/day and 30 days of treatment. CONCLUSIONS Methylphenidate was widely consumed in the municipality, mainly in the most developed areas. Of note, the formulations with the highest abuse risk were the most consumed. Both its prescription and dispensation contrasted with current pharmacotherapeutic recommendations and legal requirements. Therefore, the commercialization of methylphenidate should be monitored more closely, and its use in the treatment of behavioral changes of psychological disorders needs to be discussed in detail, in line with the concepts of the quality use of medicines.
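The 0.37 DDD/1,000 inhabitants/day figure follows the standard WHO drug-utilization metric, which can be sketched as below. The function and the example numbers are a generic illustration; the study's raw sales totals and population are not reproduced here:

```python
def ddd_per_1000_inh_day(total_mg_sold, ddd_mg, population, days):
    """Consumption expressed in DDD per 1,000 inhabitants per day.

    total_mg_sold: total amount dispensed over the period, in mg
    ddd_mg:        WHO defined daily dose for the drug, in mg
    population:    population covered by the sales data
    days:          length of the study period, in days
    """
    total_ddd = total_mg_sold / ddd_mg
    return total_ddd / population / days * 1000

# Illustrative check with round numbers: 30,000 mg sold of a drug with a
# 30 mg DDD, in a population of 1,000, over one day, gives 1,000 DDDs
# per 1,000 inhabitants per day.
example = ddd_per_1000_inh_day(30_000, 30, 1_000, 1)
```

A value of 0.37 therefore means that, on an average day, roughly 0.37 defined daily doses were dispensed per 1,000 inhabitants.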
Abstract:
OBJECTIVE To assess the validity of dengue fever reports and how they relate to the definition of case and severity. METHODS A diagnostic test assessment was conducted using cross-sectional sampling from a universe of 13,873 patients treated during the fifth epidemiological period in health institutions from 11 Colombian departments in 2013. The test under analysis was the reporting to the National Public Health Surveillance System, and the reference standard was the review of histories identified by active institutional search. We reviewed all histories of patients diagnosed with dengue fever, as well as a random sample of patients with febrile syndromes. The sensitivity and specificity of reporting were estimated, using the inverse of the probability of being selected as weights. The concordance between reporting and the findings of the active institutional search was calculated using the Kappa statistic. RESULTS We included 4,359 febrile patients, and 31.7% were classified as compatible with dengue fever (17 with severe dengue fever; 461 with dengue fever and warning signs; 904 with dengue fever and no warning signs). The global sensitivity of reports was 13.2% (95%CI 10.9;15.4) and specificity was 98.4% (95%CI 97.9;98.9). Sensitivity varied according to severity: 12.1% (95%CI 9.3;14.8) for patients presenting dengue fever with no warning signs; 14.5% (95%CI 10.6;18.4) for those presenting dengue fever with warning signs; and 40.0% (95%CI 9.6;70.4) for those with severe dengue fever. Concordance between reporting and the findings of the active institutional search resulted in a Kappa of 10.1%. CONCLUSIONS Low concordance was observed between reporting and the review of clinical histories, which was associated with the low reporting of dengue-fever-compatible cases, especially milder ones.
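Sensitivity, specificity, and Cohen's Kappa are all computed from the 2×2 table of reporting versus active-search findings. The sketch below is a generic, unweighted illustration; the study additionally weighted each observation by the inverse of its probability of selection:

```python
def sensitivity(tp, fn):
    """Fraction of true cases that were actually reported."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-cases that were correctly not reported."""
    return tn / (tn + fp)

def cohen_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement between reporting and the reference standard."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                            # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)     # chance agreement on "case"
    p_no = ((fn + tn) / n) * ((fp + tn) / n)      # chance agreement on "non-case"
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)
```

A Kappa near zero, as found here, means the agreement between the surveillance system and the chart review was barely better than chance.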
Abstract:
OBJECTIVE To analyze the cases of tuberculosis and the impact of direct follow-up on the assessment of treatment outcomes. METHODS This open prospective cohort study evaluated 504 cases of tuberculosis reported in the Sistema de Informação de Agravos de Notificação (SINAN – Notifiable Diseases Information System) in Juiz de Fora, MG, Southeastern Brazil, between 2008 and 2009. The incidence of treatment outcomes was compared between a group of patients diagnosed with tuberculosis and directly followed up through monthly consultations during return visits (287 patients) and a group for which the information was collected indirectly (217 patients) through the city's surveillance system. The Chi-square test was used to compare percentages, with a significance level of 0.05. The relative risk (RR) was used to evaluate differences in the incidence rate of each type of treatment outcome between the two groups. RESULTS Of the outcomes directly and indirectly evaluated, 18.5% and 3.2% corresponded to treatment default, and 3.8% and 0.5% corresponded to treatment failure, respectively. The incidence of treatment default and failure was higher in the group with direct follow-up (p < 0.05) (RR = 5.72, 95%CI 2.65;12.34, and RR = 8.31, 95%CI 1.08;63.92, respectively). CONCLUSIONS A higher incidence of treatment default and failure was observed in the directly followed-up group, and most of these cases were neglected by the disease reporting system. Therefore, effective measures are needed to improve the control of tuberculosis and data quality.
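The reported relative risks can be reproduced from the 2×2 counts with a standard log-scale Wald confidence interval. The counts below are reconstructed from the reported percentages (18.5% of 287 ≈ 53 defaults; 3.2% of 217 ≈ 7), so this is an illustrative check, not the authors' exact computation:

```python
import math

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk of an outcome in group 1 vs. group 2, with a
    Wald-type 95% confidence interval computed on the log scale.

    a, n1: events and total in group 1 (direct follow-up)
    c, n2: events and total in group 2 (surveillance-system data)
    """
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of ln(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Treatment default: reconstructed counts from the abstract's percentages.
rr, lo, hi = relative_risk(53, 287, 7, 217)
```

With these reconstructed counts the point estimate rounds to the reported RR = 5.72, and the interval closely matches 2.65;12.34.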
Abstract:
Duchenne muscular dystrophy (DMD) is a severe, progressive disease first described by Meryon in 1852 and later by Guillaume Duchenne. It is the most common and severe form of childhood muscular dystrophy, affecting 1 in 3,500 live male births. It is caused by an X-linked recessive genetic disorder resulting in a deficiency of the dystrophin protein, which is responsible for linking contractile proteins to the sarcolemma. Diagnosis is not always easy, and the first symptoms are often related to weakness and difficulty or delay in acquiring the ability to perform simple activities. Progressive weakness leads to the use of compensatory strategies in order to maintain the ability to walk and perform other activities. Respiratory muscles are also affected, and the complications resulting from their impairment are frequently the cause of the early death of these patients. Advances in DMD management have increased the life expectancy of these children, creating the need for adequate care in adulthood. DMD manifestations include muscle weakness, contractures, and respiratory and cardiac complications. Some authors also report that one third of patients have learning difficulties and delayed global development, because the gene that encodes dystrophin expresses various dystrophin isoforms that are found in Schwann and Purkinje cells in the brain. Impairments of body functions and structures, such as muscle weakness, contractures, and reduced range of motion, lead to limitations in activities, i.e., they affect the performance of tasks by the individual. From a physiotherapist's point of view, analyzing these limitations is mandatory, because physiotherapy's final purpose is to restore or preserve the ability to perform activities of daily living (ADL) and to improve quality of life.
Abstract:
One of the most common problems of rotating machinery is rotor unbalance. Its effects can vary from the malfunction of equipment to diseases related to exposure to high vibration levels. Although the balancing procedure is well known, qualified technicians are required to perform it. In this sense, the use of virtual balancing experiments is of great interest. The present demo is dedicated to presenting two different balancing simulators, which can be explored in conjunction, as they have complementary outputs. © 2014 IEEE.
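A virtual balancing experiment typically exercises the classic single-plane influence-coefficient procedure: measure the initial vibration, add a known trial mass, measure again, and solve for the correction mass. The sketch below is the standard textbook method, not necessarily the algorithm implemented in the demo's simulators; vibration readings are modeled as complex phasors (amplitude · e^(j·phase)):

```python
import cmath

def single_plane_balance(v0, v1, trial_mass):
    """Return the correction mass magnitude and its angle in degrees.

    v0:         vibration phasor measured before the trial run
    v1:         vibration phasor measured with the trial mass attached
    trial_mass: trial mass as a phasor (mass * exp(j * mounting angle))
    """
    influence = (v1 - v0) / trial_mass   # rotor response per unit mass
    correction = -v0 / influence         # mass phasor that cancels v0
    return abs(correction), cmath.phase(correction) * 180 / cmath.pi

# Illustrative run: initial vibration of 5 units at 0 degrees; a 2 g trial
# mass at 0 degrees shifts the reading to 5 + 2j. The procedure then calls
# for 5 g mounted at 90 degrees to cancel the original unbalance.
mass, angle = single_plane_balance(5 + 0j, 5 + 2j, 2 + 0j)
```

Two-plane balancing extends the same idea to a 2×2 system of influence coefficients, which is presumably where two complementary simulators become useful.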
Abstract:
Prototype validation is a major concern in modern electronic product design and development. Simulation, structural test, and functional and timing debug all form part of the validation process, although they are very often addressed as dissociated tasks. In this paper, we describe an integrated approach to board-level prototype validation, based on a set of mandatory/optional BST instructions and a built-in controller for debug and test, that addresses the aforementioned tasks as inherent parts of a whole process.
Abstract:
The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
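The mandatory behavior of the IEEE 1149.1 test infrastructure is anchored in the 16-state TAP controller, whose TMS-driven transition table is fixed by the standard and can be modeled directly. The table below follows the standard's state diagram; the `walk` helper is an illustrative sketch for reasoning about debug sequences, not a real JTAG driver:

```python
# For each state: (next state when TMS = 0, next state when TMS = 1),
# sampled on the rising edge of TCK, per the IEEE 1149.1 state diagram.
TAP = {
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR", "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR", "Exit1-DR"),
    "Shift-DR":         ("Shift-DR", "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR", "Update-DR"),
    "Pause-DR":         ("Pause-DR", "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR", "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR", "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR", "Exit1-IR"),
    "Shift-IR":         ("Shift-IR", "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR", "Update-IR"),
    "Pause-IR":         ("Pause-IR", "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR", "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def walk(tms_bits, state="Test-Logic-Reset"):
    """Apply a TMS bit sequence (one bit per TCK) and return the final state."""
    for bit in tms_bits:
        state = TAP[state][bit]
    return state
```

Note that holding TMS high for five clocks returns the TAP to Test-Logic-Reset from any state, which is why debug tools use that sequence for synchronization.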
Abstract:
Dissertation presented to obtain the degree of Doctor in Electrical and Computer Engineering from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Most traditional software and database development approaches tend to be serial, not evolutionary, and certainly not agile, especially in data-oriented aspects. Most of the more commonly used methodologies are strict, meaning they are composed of several stages, each with very specific associated tasks. A clear example is the Rational Unified Process (RUP), divided into Business Modeling, Requirements, Analysis & Design, Implementation, Testing, and Deployment. But what happens when the need for a well-designed and structured plan meets the reality of a small starting company that aims to build an entire user experience solution? Here, resource control and time productivity are vital, requirements are in constant change, and so is the product itself. To succeed in this environment, a highly collaborative and evolutionary development approach is mandatory, and constantly changing requirements imply an iterative development process. The project focuses on Data Warehouse development and business modeling. This area is usually a tricky one: business knowledge belongs to the enterprise, and how they work, their goals, and what is relevant for analysis are internal business processes. Throughout this document it is explained why Agile Modeling was chosen, and how an iterative and evolutionary methodology allowed for reasonable planning and documentation while permitting development flexibility, from idea to product. More importantly, it shows how this was applied in the development of a retail-focused Data Warehouse: a productized Data Warehouse built on the knowledge of not one but several clients' needs, one that aims not just to store the usual business areas but to create an innovative set of business metrics by joining them with store environment analysis, converting Business Intelligence into Actionable Business Intelligence.
Abstract:
Internship report presented to the Escola Superior de Educação de Lisboa to obtain the degree of Master in Teaching for the 1st and 2nd Cycles of Basic Education
Abstract:
Master's in Auditing
Abstract:
Final Master's project to obtain the degree of Master in Civil Engineering, specialization in Buildings