863 results for Public address systems
Abstract:
In Brazil, the newly created Republic expressed the interests of an elite increasingly committed to foreign capital. Rio de Janeiro accumulated vast resources in trade and finance, which were then channeled into industrial ventures. The city emerged as the articulator of Brazilian territory and as the mediator between it and the international market. In the capital of the Republic, a conservative urban plan would sweep away the old city and install images copied from Europe in the tropics with a civilizing purpose. This was made material through infrastructure financing and loans raised in Europe and North America by entrepreneurs who had been awarded public service concessions. The project relied on the strong support of mayors, who were members or representatives of the companies involved in the reforms. This work addresses the relationship between the mayors of Rio de Janeiro during the First Republic and international capital, focusing on the strategies for producing new spaces that set in motion the modernization of Brazil and of its international image.
Abstract:
Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processors, bringing a higher level of processing. Many-core technology has boosted the computing power provided by clusters of workstations or SMPs, delivering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed in such cluster and multi-cluster computing systems. To guarantee the correct execution of a message-passing parallel application in a computing environment other than the one in which it was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmark parallel applications.
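A routing decision at the heart of any such cross-cluster strategy is whether a node is reachable directly or only through a publicly addressed gateway. As an illustrative sketch only (not the paper's implementation), Python's standard `ipaddress` module can classify node addresses as private (RFC 1918) and therefore needing relay:

```python
import ipaddress

def needs_gateway_relay(addr: str) -> bool:
    """Return True if a node's address is private (RFC 1918),
    meaning cross-cluster traffic to it must be relayed through
    a publicly reachable head node of its cluster."""
    return ipaddress.ip_address(addr).is_private

# Hypothetical nodes in two clusters: some privately, some publicly addressed.
nodes = ["10.1.0.4", "192.168.0.12", "200.144.190.88"]
relayed = [n for n in nodes if needs_gateway_relay(n)]
print(relayed)  # → ['10.1.0.4', '192.168.0.12']
```

The same check lets an interfacing layer pick a direct MPI channel when both endpoints are public, falling back to gateway forwarding otherwise.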
Abstract:
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. 
DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
Abstract:
The aim of this research was to evaluate the economic costs of respiratory and circulatory diseases in the municipality of Cubatão, in the state of São Paulo, Brazil. Data on hospital admissions and on working days missed due to hospitalization (for the age group 14 to 70 years) were drawn from the database of the Sistema Único de Saúde (SUS, the Brazilian National Health System). Results: Based on these data, it was calculated that R$ 22.1 million were spent in the period 2000 to 2009 on diseases of the respiratory and circulatory systems. Part of these expenses can be directly related to the emission of atmospheric pollutants in the city. In order to estimate the costs related to air pollution, data on Cubatão were compared with data from two other coastal municipalities (Guarujá and Peruíbe) that have little industrial activity in comparison with Cubatão. It was verified that average per capita costs in both were lower than in Cubatão, but that this difference has been decreasing in recent years.
Abstract:
Background: There are no available statistical data about sudden cardiac death in Brazil. Therefore, this study was conducted to evaluate the incidence of sudden cardiac death in our population and its implications. Methods: The research methodology was based on Thurstone's Law of Comparative Judgment, whose premise is that the more stimulus A differs from stimulus B, the greater the number of people who will perceive this difference. This technique allows actual occurrences to be estimated from subjective perceptions, when compared against official statistics. Data were collected through telephone interviews conducted with primary and secondary care physicians of the Public Health Service in the Metropolitan Area of São Paulo (MASP). Results: In the period from October 19, 2009, to October 28, 2009, 196 interviews were conducted. An incidence of 21,270 cases of sudden cardiac death per year was estimated by linear regression analysis of the physicians' responses and data from the Mortality Information System of the Brazilian Ministry of Health, with the following correlation and determination coefficients: r = 0.98 and r² = 0.95 (95% confidence interval 0.8–1.0, P < 0.05). The lack of a waiting list for specialized care and socio-administrative problems were considered the main barriers to tertiary care access. Conclusions: The incidence of sudden cardiac death in the MASP is high, and it was estimated to be higher than all other causes of death; the extrapolation technique based on the physicians' perceptions was validated; and the most important bureaucratic barriers to patient referral to tertiary care were identified. (PACE 2012; 35:1326–1331)
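The extrapolation step can be illustrated with a short sketch. All figures below are hypothetical, not the study's data: a perception-derived score is regressed on official mortality counts for well-documented causes of death, and the fit is then inverted to estimate the occurrences of the cause of interest from its perception score alone:

```python
import numpy as np

# Hypothetical data: official annual death counts for causes with
# reliable statistics (x) and physicians' perception-derived scores (y).
official = np.array([5200.0, 8100.0, 12400.0, 15800.0])
perceived = np.array([11.0, 17.5, 26.0, 33.5])

# Linear fit of the perception scores against official counts.
slope, intercept = np.polyfit(official, perceived, 1)
r = np.corrcoef(official, perceived)[0, 1]  # correlation coefficient

# Invert the fit: estimate yearly occurrences of the under-reported
# cause from its perception score alone (score value is made up).
score_scd = 45.0
estimated_cases = (score_scd - intercept) / slope
```

A strong linear relation (r close to 1) is what justifies reading the estimate off the inverted fit, mirroring the r = 0.98 reported in the abstract.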
Abstract:
OBJECTIVE: Scarce data are available on the occurrence of symptomatic intracranial hemorrhage related to intravenous thrombolysis for acute stroke in South America. We aimed to determine the frequency and clinical predictors of symptomatic intracranial hemorrhage after stroke thrombolysis at our tertiary emergency unit in Brazil. METHOD: We reviewed the clinical and radiological data of 117 consecutive acute ischemic stroke patients treated with intravenous thrombolysis in our hospital between May 2001 and April 2010. We compared our results with those of the Safe Implementation of Thrombolysis in Stroke registry. Univariate and multiple regression analyses were performed to identify factors associated with symptomatic intracranial hemorrhage. RESULTS: In total, 113 cases from the initial sample were analyzed. The median National Institutes of Health Stroke Scale score was 16 (interquartile range: 10-20). The median onset-to-treatment time was 188 minutes (interquartile range: 155-227). There were seven symptomatic intracranial hemorrhages (6.2%; Safe Implementation of Thrombolysis in Stroke registry: 4.9%; p = 0.505). In the univariate analysis, current statin treatment and elevated National Institutes of Health Stroke Scale scores were related to symptomatic intracranial hemorrhage. After the multivariate analysis, current statin treatment was the only factor independently associated with symptomatic intracranial hemorrhage. CONCLUSIONS: In this series of Brazilian patients with severe strokes treated with intravenous thrombolysis in a public university hospital at a late treatment window, we found no increase in the rate of symptomatic intracranial hemorrhage. Additional studies are necessary to clarify the possible association between statins and the risk of symptomatic intracranial hemorrhage after stroke thrombolysis.
Abstract:
This study evaluated the five-year clinical performance of ceramic inlays and onlays made with two systems: sintered Duceram (Dentsply-Degussa) and pressable IPS Empress (Ivoclar Vivadent). Eighty-six restorations were placed by a single operator in 35 patients with a median age of 33 years. The restorations were cemented with a dual-cured resin cement (Variolink II, Ivoclar Vivadent) and Syntac Classic adhesive under rubber dam. Evaluations were conducted by two independent investigators at baseline and at one, two, three, and five years using the modified United States Public Health Service (USPHS) criteria. At the five-year recall, 26 patients (74.28%) were evaluated, totalling 62 (72.09%) restorations. Four IPS Empress restorations had fractured, two restorations presented secondary caries (one IPS Empress and one Duceram), and two restorations showed unacceptable defects at the restoration margin and needed replacement (one from each ceramic system). An overall success rate of 87% was recorded. The Fisher exact test revealed no significant difference between the Duceram and IPS Empress ceramic systems for any aspect evaluated at the different recall appointments (p>0.05). The McNemar chi-square test showed significant differences in marginal discoloration, marginal integrity, and surface texture between baseline and the five-year recall for both systems (p<0.001), with an increased percentage of Bravo scores. However, few Charlie or Delta scores were attributed to these restorations. In conclusion, both ceramic materials demonstrated acceptable clinical performance after five years.
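The McNemar comparison of paired scores between recalls can be sketched as follows; the counts are hypothetical, not the study's, and only the discordant pairs (restorations whose score changed between baseline and the five-year recall) enter the test:

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar test on the discordant pairs of a
    paired 2x2 table: b restorations that worsened (e.g. Alpha to
    Bravo) and c that improved between the two recalls. Under the
    null, each discordant pair worsens or improves with p = 1/2."""
    n = b + c
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Hypothetical counts: 18 restorations shifted from Alpha to Bravo
# in marginal discoloration, while only 1 shifted back.
p_value = mcnemar_exact(18, 1)  # well below 0.001
```

A shift this asymmetric toward Bravo is significant, matching the direction of change the abstract reports between baseline and five years.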
Abstract:
Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for the information processing systems of research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm, which is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing, and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity. Currently, most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating the methodologies: a meta-model specifies the concepts, rules, and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is clear at least that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its spatial structure, whether logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
Abstract:
The evolution of embedded electronics applications forces electronic systems designers to meet ever-increasing requirements. This evolution pushes up the computational power demanded of digital signal processing systems, as well as the energy required to accomplish the computations, due to the increasing mobility of such applications. Current approaches to meeting these requirements rely on the adoption of application-specific signal processors. Such devices exploit powerful accelerators, which are able to meet both performance and energy requirements. On the other hand, the high specificity of such accelerators often results in a lack of flexibility, which adversely affects non-recurring engineering costs, time to market, and market volumes. The state of the art mainly proposes two solutions to overcome these issues while aiming to deliver reasonable performance and energy efficiency: reconfigurable computing and multi-processor computing. Both solutions benefit from post-fabrication programmability, which results in increased flexibility. Nevertheless, the gap between these approaches and dedicated hardware is still too wide for many application domains, especially when targeting the mobile world. In this scenario, flexible and energy-efficient acceleration can be achieved by merging the two computational paradigms in order to address all the constraints introduced above. This thesis explores the design and application spectrum of reconfigurable computing exploited as application-specific accelerators for multi-processor systems-on-chip. More specifically, it introduces a reconfigurable digital signal processor featuring a heterogeneous set of reconfigurable engines, and a homogeneous multi-core system exploiting three different flavours of reconfigurable and mask-programmable technologies as implementation platforms for application-specific accelerators. In this work, the various trade-offs concerning the utilization of multi-core platforms and the different configuration technologies are explored, characterizing the design space of the proposed approach in terms of programmability, performance, energy efficiency, and manufacturing costs.
Abstract:
Investigating parents' formal engagement opportunities in public schools serves well to characterize the relationship between states and societies. While the relationship between parental involvement and students' academic success has been thoroughly investigated, it has rarely been used as an indicator of countries' governing regimes. The researcher was curious to see whether and how parents' voice differs across democracies. The hypothesis was that in mature regimes, institutional opportunities for formal parental engagement are plentiful and parents are actively involved, while in young democracies there are fewer opportunities and engagement is lower. The assumption was also that parents' deliberation in expressing their dissatisfaction with schools differs across democracies: where it is more intense, it translates into higher engagement. Parents' awareness of relevant regulations and agendas was assumed to be equally average, and their demographic background was assumed to have similar effects on engagement. A comparative, most-different-systems design was employed, with the parents of last-grade public middle school students in Tartu, Estonia and in Huntsville, Alabama, United States serving as the sample. The multidimensional study includes a theoretical review, country and community analyses, an institutional analysis of formal parental involvement, and a parents' survey. The findings revealed sizeable differences between parents' engagement levels in Huntsville and Tartu. The results indicate passivity in both communities, while in Tartu engagement seems to be alarmingly low. Furthermore, Tartu parents have far fewer institutional opportunities to engage. In the United States, multilevel efforts to engage parents are visible from the local to the federal level; in Estonia similar intentions seem to be missing and meaningful parental organizations do not exist. In terms of civic education there is much room for development in both countries. The road will be longer for Estonia, a young democracy, in transforming its institutional systems from formally democratic to inherently inclusive.
Abstract:
The presented thesis revolves around the study of thermally responsive PNIPAAm-based hydrogels in water-based environments, as studied by Fluorescence Correlation Spectroscopy (FCS). The goal of the project was the engineering of PNIPAAm gels into biosensors. Specifically, a range of such gels were investigated with respect to their dynamics and structure at the nanometer scale, and to their performance in retaining bound bodies upon thermal collapse (which PNIPAAm undergoes upon heating above 32 °C). The requirements of FCS as a technique match the limitations imposed by the system, namely the need to intimately probe a system in a solvent that is fragile and easy to alter: FCS requires a fluid environment to work and is based on observing the diffusion of fluorescent probes at nanomolar concentrations. FCS was therefore applied to probe the hydrogels at the nanometer scale with minimal invasiveness. The variables of the gels addressed in the project included the degree of crosslinking; structural changes during thermal collapse; behavior in different buffers; the possibility of decreasing the degree of inhomogeneity; the behavior of differently sized probes; and the effectiveness of antibody functionalization upon thermal collapse. The results evidenced included the heightening of structural inhomogeneities during thermal collapse and under different buffer conditions; the use of annealing to decrease the degree of inhomogeneity; the use of differently sized probes to address different length scales of the gel; and successful functionalization both before and after collapse. The thesis also covers two side projects, likewise carried forward via FCS. The first, on diffusion in inverse opals, produced a predictive simulation model for the diffusion of bodies in confined systems as a function of the bodies' size relative to the characteristic sizes of the system. The second was the observation of the interaction of bodies of opposite charge in aqueous solution, resulting in a phenomenological theory and an evaluation method for both the average residence time of the different bodies together and their attachment likelihood.
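For reference, the fits behind such FCS measurements typically use the standard autocorrelation function for free 3D diffusion through a Gaussian confocal volume (a textbook model, not a result specific to this thesis):

```python
import numpy as np

def fcs_autocorr(tau, N, tau_D, s=5.0):
    """Standard FCS autocorrelation for free 3D diffusion:
    N is the mean number of fluorescent probes in the focal volume,
    tau_D the characteristic diffusion time, and s the ratio of the
    axial to the lateral radius of the Gaussian detection volume."""
    tau = np.asarray(tau, dtype=float)
    return (1.0 / N) / ((1.0 + tau / tau_D)
                        * np.sqrt(1.0 + tau / (s**2 * tau_D)))

# The amplitude at tau = 0 gives 1/N; slower diffusion (larger tau_D),
# as for probes trapped in a collapsing gel, shifts the decay to
# longer lag times.
lags = np.logspace(-6, 0, 7)  # lag times in seconds
curve = fcs_autocorr(lags, N=10.0, tau_D=1e-4)
```

Fitting measured curves with this form yields the probe concentration and diffusion time that characterize the gel at a given length scale.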
Abstract:
Agriculture is still important for socio-economic development in rural areas of Bosnia, Montenegro and Serbia (BMS). However, for sustainable rural development, rural economies should be diversified, so attention should also be paid to off-farm and non-farm income-generating activities. Agricultural and rural development (ARD) processes and farm activity diversification initiatives should be well governed. The ultimate objective of this work is to explore the linkages between ARD governance and rural livelihood diversification in BMS. The thesis is based on an extended secondary data analysis and on surveys. Questionnaires on ARD governance and coordination were sent via email to public, civil society, and international organizations. Concerning rural livelihood diversification, field questionnaire surveys were carried out in three rural regions of BMS. Results show that local rural livelihoods are increasingly diversified, but a significant share of households is still engaged in agriculture. Diversification strategies have a chance to succeed given the three rural regions' assets. However, rural households have to tackle many problems in developing new income-generating activities, such as the lack of financial resources; weak business skills are also a limiting factor. Fully exploiting the rural economy diversification potential in BMS requires many interventions, including improving rural governance, enhancing service delivery in rural areas, upgrading rural people's human capital, strengthening rural social capital, improving physical capital and the rural population's access to finance, and creating a favourable and enabling legal and legislative environment fostering diversification. Governance and coordination of ARD policy design, implementation, and evaluation remain challenging in the three Balkan countries, and this also has repercussions on the pace of rural livelihood diversification. Therefore, there is a strong and urgent need to mobilize all rural stakeholders and actors through appropriate governance arrangements in order to foster rural livelihood diversification and improve quality of life.
Abstract:
Alternans of cardiac action potential duration (APD) is a well-known arrhythmogenic mechanism that results from dynamical instabilities. The propensity to alternans is classically investigated by examining APD restitution and by deriving APD restitution slopes as predictive markers. However, experiments have shown that such markers are not always accurate predictors of alternans. Using a mathematical ventricular cell model known to exhibit unstable dynamics of both membrane potential and Ca2+ cycling, we demonstrate that an accurate marker can be obtained by pacing at cycle lengths (CLs) varying randomly around a basic CL (BCL) and by evaluating the transfer function between the time series of CLs and APDs using an autoregressive-moving-average (ARMA) model. The first pole of this transfer function corresponds to the eigenvalue (λalt) of the dominant eigenmode of the cardiac system, which predicts that alternans occurs when λalt ≤ −1. For different BCLs, control values of λalt were obtained using eigenmode analysis and compared to the first pole of the transfer function estimated by ARMA model fitting in simulations of random pacing protocols. In all versions of the cell model, this pole provided an accurate estimate of λalt. Furthermore, during slow ramp decreases of BCL or simulated drug application, this approach predicted the onset of alternans by extrapolating the time course of the estimated λalt. In conclusion, stochastic pacing and ARMA model identification represent a novel approach to predicting alternans without making any assumptions about its ionic mechanisms. It should therefore be applicable experimentally to any type of myocardial cell.
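A minimal sketch of the identification step, using a linearized toy beat-to-beat map rather than the paper's ionic cell model (all parameter values below are illustrative): pace with randomly varying CLs, fit an ARX(1) relation between the CL and APD series, and read off the pole, which estimates λalt:

```python
import numpy as np

rng = np.random.default_rng(0)
BCL, APD0 = 300.0, 210.0      # ms; hypothetical operating point
lam_true, gamma = -0.8, 0.3   # dominant eigenvalue and CL gain (made up)

# Stochastic pacing: cycle lengths varying randomly around the BCL.
n = 2000
CL = BCL + rng.uniform(-20.0, 20.0, n)

# Linearized beat-to-beat dynamics around the operating point.
APD = np.empty(n)
APD[0] = APD0
for i in range(1, n):
    APD[i] = APD0 + lam_true * (APD[i-1] - APD0) + gamma * (CL[i] - BCL)

# ARX(1) identification: regress APD_n on APD_{n-1}, CL_n, and a constant.
X = np.column_stack([APD[:-1], CL[1:], np.ones(n - 1)])
coef, *_ = np.linalg.lstsq(X, APD[1:], rcond=None)
lam_est = coef[0]  # first pole of the transfer function, estimates λalt
print(round(lam_est, 3))  # → -0.8; alternans predicted once lam_est <= -1
```

Because the toy map is linear and noise-free, the regression recovers the pole essentially exactly; tracking `lam_est` while BCL slowly decreases is what allows the onset of alternans (the pole crossing −1) to be anticipated.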