858 results for SUPPLY AND INFORMATION NETWORKS


Relevance:

100.00%

Publisher:

Abstract:

The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crime. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for deep network analysis by capturing, recording and analyzing network events to determine the source of a security attack or other information security incident. Existing network forensics work has focused mostly on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitate new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is essential to documenting network incidents; however, in network topology spaces, location cannot be measured because no distance metric exists. A novel solution was therefore proposed to label the locations of nodes within network topology spaces and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollable recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques were a starting point for further research and development toward equipping future ad hoc networks with forensic components that complement existing security mechanisms.
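A minimal sketch of the DHT-based reporting idea, assuming a ring-style DHT; the class name, the `report` API and the TTL guard are illustrative stand-ins, since the abstract does not detail the dissertation's actual anti-recursion mechanism:

```python
import hashlib
from bisect import bisect_right

class IncidentDHT:
    """Toy ring DHT for incident reports. Illustrative only: a simple
    hop budget (ttl) stands in for the dissertation's mechanism that
    prevents report storage from recursively triggering new reports."""
    def __init__(self, node_ids):
        # Place nodes on a hash ring, sorted by hashed position.
        self.ring = sorted((self._h(n), n) for n in node_ids)
        self.store = {}

    @staticmethod
    def _h(key):
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    def responsible_node(self, key):
        # The first node clockwise from the key's position owns it.
        idx = bisect_right(self.ring, (self._h(key), "")) % len(self.ring)
        return self.ring[idx][1]

    def report(self, incident_id, record, ttl=3):
        # Without a guard like ttl, storing a report could itself
        # generate new reports at the storing node, recursing forever.
        if ttl <= 0:
            return None
        node = self.responsible_node(incident_id)
        self.store.setdefault(node, []).append(record)
        return node
```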

Relevance:

100.00%

Publisher:

Abstract:

With the advent of peer-to-peer networks and, more importantly, sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, such applications require the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet existing data stream cleaning and filtering schemes cannot capture the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the sensor nodes of potential interest. Additionally, this dissertation models a wireless sensor network data reduction system, showing that separating the data adaptation and prediction processes increases data reduction rates. The schemes presented in this study are evaluated using simulation and information-theoretic concepts. The results demonstrate that dynamic environmental conditions are better managed when validation is used for data cleaning. They also show that data reduction rates improve significantly when a fast-converging adaptation process is deployed. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
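To illustrate the flavour of neighbour-based validation, a minimal sketch; it assumes the relevant neighbour readings have already been selected by the geometric mapping, and the robust z-score rule is my stand-in, not the dissertation's scheme:

```python
import numpy as np

def validate_reading(reading, neighbor_readings, k=3.0):
    """Flag a sensor reading as suspect if it deviates from its
    spatio-temporally associated neighbors by more than k robust
    standard deviations (median/MAD, resistant to corrupted values)."""
    neighbors = np.asarray(neighbor_readings, dtype=float)
    med = np.median(neighbors)
    mad = np.median(np.abs(neighbors - med)) or 1e-9  # avoid div by zero
    z = 0.6745 * abs(reading - med) / mad             # robust z-score
    return z <= k  # True = consistent with neighbors, keep the reading

print(validate_reading(21.4, [21.1, 21.3, 21.6, 20.9]))  # True
print(validate_reading(48.0, [21.1, 21.3, 21.6, 20.9]))  # False
```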

Relevance:

100.00%

Publisher:

Abstract:

Water systems in the Sultanate of Oman are inevitably exposed to varied threats arising from both natural and man-made hazards. Natural disasters, especially tropical cyclone Gonu in 2007, have caused immense damage to water supply systems in Oman, while water loss from leaks is a major operational problem. This research developed an integrated approach to identify and rank the risks to the water sources, transmission pipelines and distribution networks in Oman, and suggests appropriate mitigation measures. The system resilience was evaluated and an emergency response plan for the water supplies developed. The methodology involved mining the data held by the water supply utility for risk and resilience determination, and operational data to support calculations of non-revenue water (NRW). Risk factors were identified, ranked and scored at a stakeholder workshop, and the operational information required was gathered principally from interviews. Finally, an emergency response plan was developed by evaluating the risk and resilience factors. The risk analysis and assessment used a Coarse Risk Analysis (CRA) approach, and risk scores were generated using a simple risk matrix based on WHO recommendations. The likelihoods and consequences of a wide range of hazardous events were identified through a key workshop and subsequent questionnaires. The thesis proposes a method of translating the detailed risk evaluations into resilience scores, adapting a methodology used in transportation networks. A water audit indicated that the percentage of NRW in Oman is greater than 35%, which is similar to other Gulf countries but high by international standards. The principal strategy for managing NRW used in the research was the AWWA water audit method, which includes free-to-use software and was found to be easy to apply in Oman. The research showed that risks to the main desalination processes can be controlled, but the risk from feed water quality might remain high even after mitigation measures are implemented, because the intake is close to an oil port with a significant risk of oil contamination and algal blooms. The most severe risks to transmission mains were found to be associated with pipe rather than pump failure. The systems in Oman were found to be moderately resilient: the resilience of the desalination plants is reasonably high, but the transmission mains and pumping stations are very vulnerable. The integrated strategy developed in this study has wide applicability, particularly in the Gulf area, which faces risks from exceptional events and experiences NRW. Other developing countries may experience such risks at different magnitudes, and the risk evaluation tables could provide a useful format for further work.
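Two of the quantities above lend themselves to small worked examples. A sketch of a top-down NRW percentage and a CRA-style risk score follows; both are simplified (the full AWWA water balance splits losses into further components, and the WHO-style matrix bands scores into risk classes):

```python
def nrw_percentage(system_input, billed_authorised):
    """Non-revenue water as a share of system input volume
    (top-down water balance, AWWA-style, simplified)."""
    return 100.0 * (system_input - billed_authorised) / system_input

def risk_score(likelihood, consequence):
    """Coarse Risk Analysis: score = likelihood x consequence,
    both rated on a semi-quantitative 1-5 scale."""
    return likelihood * consequence

# e.g. 100 units in, 64 billed -> 36% NRW, consistent with the >35% found
print(nrw_percentage(100.0, 64.0))   # 36.0
print(risk_score(4, 5))              # 20: a high-priority hazard
```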

Relevance:

100.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information relevant to their research. In response, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged, striving to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature, extracting and aggregating the information found within while automatically normalizing the variability of natural language statements. Among the different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature, and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since the top-ranking event extraction systems are based on machine learning and are trained on narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems generate incorrect biomolecular events that end-users then encounter. This thesis proposes a novel post-processing approach, combining supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing their overall credibility. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as the types and directions of these edges, using a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
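As a hedged sketch of the supervised half of such a post-processing filter (synthetic features and labels stand in for real event data; the thesis's actual feature set and learner are not specified in the abstract):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in: 500 extracted events x 5 features (e.g. trigger
# confidence, sentence length, argument types), label 1 = correct event.
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Train a classifier on annotated events, then drop low-confidence ones.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
keep = clf.predict_proba(X)[:, 1] >= 0.5   # simple filtering threshold
print(f"{(~keep).sum()} events filtered out of {len(keep)}")
```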

Relevance:

100.00%

Publisher:

Abstract:

The paper presents a critical analysis of the extant literature pertaining to the networking behaviours of young jobseekers in both offline and online environments. A framework derived from information behaviour theory is proposed as a basis for conducting further research in this area. Method. Relevant material for the review was sourced from key research domains such as library and information science, job search research, and organisational research. Analysis. Three key research themes emerged from the analysis of the literature: (1) social networks, and the use of informal channels of information during job search, (2) the role of networking behaviours in job search, and (3) the adoption of social media tools. Tom Wilson’s general model of information behaviour was also identified as a suitable framework to conduct further research. Results. Social networks have a crucial informational utility during the job search process. However, the processes whereby young jobseekers engage in networking behaviours, both offline and online, remain largely unexplored. Conclusion. Identification and analysis of the key research themes reveal opportunities to acquire further knowledge regarding the networking behaviours of young jobseekers. Wilson’s model can be used as a framework to provide a holistic understanding of the networking process, from an information behaviour perspective.

Relevance:

100.00%

Publisher:

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative, multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.

The discussion begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips of more sophisticated design used in preclinical cancer research and clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It can extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level that in turn confer robustness on the entire population. The notion of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Tools derived from that field can therefore be applied, using the fluctuations themselves to determine the nature of the signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
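Schematically, a quantitative Le Chatelier relation of the kind referred to above can be written as follows (my notation, a sketch of the standard statistical-mechanics form; the thesis may define the terms differently): the shift in mean protein levels under a small perturbation is set by the measured protein-protein covariance of the single-cell fluctuations,

```latex
\delta\langle x_i\rangle \;\approx\; \beta \sum_j C_{ij}\, f_j ,
\qquad
C_{ij} = \langle x_i x_j\rangle - \langle x_i\rangle\langle x_j\rangle ,
```

where $f_j$ is the generalized force conjugate to protein $j$ (here, e.g., a change in oxygen partial pressure) and $\beta$ plays the role of an effective inverse temperature.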

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and identifying effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by a targeted inhibitor. Most signaling cascades consist of strongly coupled protein-protein interactions; a physical analogy is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). In this way, two independent signaling modes were resolved, one associated with mTOR signaling and a second with ERK/Src signaling, which in turn allowed us to anticipate resistance and to design effective combination therapies, as well as to identify those therapies and therapy combinations that would be ineffective. We validated our predictions in mouse tumor models, and all predictions were borne out.
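A minimal numerical sketch of the mode decomposition described above, with synthetic data standing in for SCBC measurements:

```python
import numpy as np

# levels: (n_cells x n_proteins) matrix of single-cell phosphoprotein
# assay values. Synthetic stand-in; real SCBC data would be loaded here.
rng = np.random.default_rng(1)
levels = rng.normal(size=(200, 6))

C = np.cov(levels, rowvar=False)       # protein-protein covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # diagonalize (C is symmetric)

# Each eigenvector is a linear combination of proteins: an independent
# "signaling mode". Large eigenvalues mark the dominant modes (cf. the
# mTOR- and ERK/Src-associated modes resolved in the thesis).
order = np.argsort(eigvals)[::-1]
dominant_modes = eigvecs[:, order[:2]]
print(dominant_modes.shape)            # (6, 2): protein weights per mode
```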

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending the current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation are presented, together with our solutions for addressing them. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance:

100.00%

Publisher:

Abstract:

Paper prepared by Marion Panizzon and Charlotte Sieber-Gasser for the International Conference on the Political Economy of Liberalising Trade in Services, Hebrew University of Jerusalem, 14-15 June 2010.

Recent literature has shed light on the economic potential of cross-border networks. These networks, consisting of expatriates and their acquaintances from abroad and at home, provide the basis for the creation of cross-border value-added chains and therewith the means for turning brain drain into brain circulation. Both aspects are potentially valuable for economic growth in the developing world. Unilateral co-development policies operating through co-funding of expatriate business ventures, as well as bilateral agreements liberalising circular migration for a limited set of persons, testify to governments' increasing awareness of the potential that expatriate networks hold for economic growth in developing countries. While such isolated efforts are valuable, this paper argues that, viewed from a long-term perspective, these top-down, government-mandated diaspora stimulation programs will not replace the market-driven liberalisation of infrastructure and other services in developing countries. Nor will they carry, in the case of circular labour migration, the political momentum to liberalise labour market admission for those non-nationals who will eventually emerge as the future transnational entrepreneurs. It will take a combination of mode 4 and infrastructure services openings-cum-regulation for countries on both sides of the spectrum to provide the basis and precondition for transnational business and entrepreneurial networks to emerge and translate into cross-border, value-added production chains. Two key issues are of particular relevance in this context: (i) the services sector, especially infrastructure, tends to suffer from inefficiencies, particularly in developing countries, and (ii) labour migration, a highly complex issue, still faces disproportionately rigid barriers despite well-documented global welfare gains. Both hinder emerging markets from taking full advantage of the potential of these cross-border networks. Adapting the legal framework to enhance the regulatory and institutional frameworks for services trade, especially in infrastructure services sectors (ISS) and labour migration, could provide the incentives necessary for brain circulation and strengthen cross-border value-added chains by lowering transaction costs. This paper analyses the shortfalls of the global legal framework, namely the shallow status quo of GATS commitments in ISS and mode 4 in particular, in relation to stimulating brain circulation and the creation of cross-border value-added chains in emerging markets. It highlights the necessity of adapting the legal framework, on both the global and the regional level, to stimulate broader and wider market access in the four key ISS sectors (telecommunications, transport, professional and financial services) in developing countries, as domestic supply capacity, global competitiveness and economic diversification in ISS sectors are necessary for mobilising expatriate returns, both physical and virtual. The paper argues that industrialised, labour-receiving countries need to offer mode 4 market access to wider categories of persons, especially students, graduate trainees and young professionals from abroad. Furthermore, free trade in semi-finished products and mode 4 market access are crucial for the creation of cross-border value-added chains across the developing world. Finally, on the basis of a case study on Jordan, the paper discusses why the key features of trade agreements that promote circular migration and the creation of cross-border value-added chains consist of trade liberalisation in services and liberal migration policies.

Relevance:

100.00%

Publisher:

Abstract:

Conventional web search engines are centralised: a single entity crawls and indexes the documents selected for future retrieval and controls the relevance models used to determine which documents are relevant to a given user query. As a result, these search engines suffer from several technical drawbacks, such as handling scale, timeliness and reliability, in addition to ethical concerns such as commercial manipulation and information censorship. Alleviating the need to rely entirely on a single entity, Peer-to-Peer (P2P) Information Retrieval (IR) has been proposed as a solution: it distributes the functional components of a web search engine, from crawling and indexing documents to query processing, across the network of users (or peers) who use the search engine. This strategy for constructing an IR system poses several efficiency and effectiveness challenges, which have been identified in past work. Accordingly, this thesis makes several contributions towards advancing the state of the art in P2P-IR effectiveness by improving the query processing and relevance scoring aspects of P2P web search. Federated search systems are a form of distributed information retrieval that routes the user's information need, formulated as a query, to distributed resources and merges the retrieved result lists into a final list. P2P-IR networks are one form of federated search in routing queries and merging results among participating peers. The query is propagated through disseminated nodes to reach the peers most likely to contain relevant documents, and the retrieved result lists are merged at different points along the path from the relevant peers back to the query initiator (the consumer). Query routing is, however, considered one of the major challenges and a critical part of P2P-IR networks: relevant peers may be missed through low-quality peer selection during query routing, inevitably leading to less effective retrieval. This motivates the study and proposal, in this thesis, of query routing techniques to improve retrieval quality in such networks. Cluster-based semi-structured P2P-IR networks exploit the cluster hypothesis to organise the peers into semantically similar clusters, each managed by super-peers. In this thesis, I construct three semi-structured P2P-IR models and examine their retrieval effectiveness. I also leverage the cluster centroids at the super-peer level, as content representations gathered from cooperative peers, to propose a query routing approach called Inverted PeerCluster Index (IPI), which mimics the conventional inverted index of a centralised corpus to organise the statistics of peers' terms. The results show retrieval quality competitive with baseline approaches. Furthermore, I study the applicability of conventional IR models as peer selection approaches, where each peer can be considered a large document of documents. The experimental evaluation shows competitive and significant results, indicating that document retrieval methods are very effective for peer selection, which reinforces the analogy between documents and peers. Additionally, Learning to Rank (LtR) algorithms are exploited to build a learned classifier for peer ranking at the super-peer level. The experiments show significant results against state-of-the-art resource selection methods and competitive results against corresponding classification-based approaches.
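A hedged sketch of an IPI-style routing structure follows; the scoring is a plain weight sum, not the thesis's exact model, and the peer names and toy centroids are illustrative:

```python
from collections import defaultdict

class InvertedPeerClusterIndex:
    """Term -> per-peer weights gathered from cluster centroids at the
    super-peer; peers are ranked by summed weights over query terms."""
    def __init__(self):
        self.index = defaultdict(dict)

    def add_centroid(self, peer_id, centroid):
        # centroid: {term: weight}, the peer's content representation.
        for term, weight in centroid.items():
            self.index[term][peer_id] = weight

    def route(self, query_terms, k=3):
        # Score every peer that matches any query term; return the top k.
        scores = defaultdict(float)
        for term in query_terms:
            for peer, weight in self.index.get(term, {}).items():
                scores[peer] += weight
        return sorted(scores, key=scores.get, reverse=True)[:k]

ipi = InvertedPeerClusterIndex()
ipi.add_centroid("peer_a", {"glioma": 2.0, "therapy": 1.0})
ipi.add_centroid("peer_b", {"routing": 3.0})
print(ipi.route(["glioma", "routing"]))  # ['peer_b', 'peer_a']
```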
Finally, I propose reputation-based query routing approaches that exploit the idea, familiar from social community networks, of providing feedback on a specific item and managing it for future decision-making. The system monitors users' behaviour when they click on or download documents from the final ranked list, treats this as implicit feedback, and mines the information to build a reputation-based data structure. That data structure is used to score peers and then rank them for query routing. I conduct a set of experiments covering various scenarios, including noisy feedback (i.e., positive feedback on non-relevant documents), to examine the robustness of the reputation-based approaches. The empirical evaluation shows significant results on almost all measurement metrics, with improvements of more than approximately 56% over baseline approaches. Based on these results, if one were to choose a single technique, the reputation-based approaches are clearly the natural choice, and they can also be deployed on any P2P network.
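As a sketch of the implicit-feedback idea, the update below uses an exponential moving average as my stand-in for the thesis's reputation data structure:

```python
def update_reputation(rep, peer, clicked, downloaded, alpha=0.1):
    """Update a peer's reputation from implicit feedback on a document
    it returned: downloads count more than clicks, ignored results decay
    the score. rep is a dict peer_id -> reputation in [0, 1]."""
    signal = 1.0 if downloaded else (0.5 if clicked else 0.0)
    rep[peer] = (1 - alpha) * rep.get(peer, 0.5) + alpha * signal
    return rep

rep = {}
rep = update_reputation(rep, "peer_a", clicked=True, downloaded=True)
rep = update_reputation(rep, "peer_b", clicked=False, downloaded=False)
print(sorted(rep, key=rep.get, reverse=True))  # routing order: peer_a first
```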

Relevance:

100.00%

Publisher:

Abstract:

Nelore is the major beef cattle breed in Brazil, with more than 130 million head. Genome-wide association studies (GWAS) are often used to associate markers and genomic regions with growth and meat quality traits, which can then be used to assist selection programs. An alternative to traditional GWAS, which constructs gene interaction networks from the results of several GWAS, is the AWM (Association Weight Matrices)/PCIT (Partial Correlation and Information Theory) approach. With the aim of evaluating the genetic architecture of Brazilian Nelore cattle, we used high-density SNP genotyping data (~770,000 SNPs) from 780 Nelore animals comprising 34 half-sibling families derived from highly disseminated and unrelated sires from across Brazil. The AWM/PCIT methodology was employed to evaluate the genes that participate in a series of eight phenotypes related to growth and meat quality obtained from this Nelore sample.
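For readers unfamiliar with PCIT, its core quantity is the first-order partial correlation, sketched below; the information-theory threshold that PCIT applies to decide which associations survive is omitted:

```python
import numpy as np

def partial_corr(rxy, rxz, ryz):
    """Correlation between genes x and y after conditioning on a third
    gene z. PCIT computes this for every gene trio and keeps the x-y
    edge only if it survives conditioning on all z (threshold omitted)."""
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

# A strong raw correlation that is largely explained by a shared driver z:
print(round(partial_corr(rxy=0.60, rxz=0.70, ryz=0.75), 3))  # 0.159
```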

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a joint location-inventory model is proposed that simultaneously optimises strategic supply chain design decisions, such as facility location and the allocation of customers to facilities, together with tactical-operational inventory management and production scheduling decisions. All of this is analysed in a context of demand and supply uncertainty. While demand uncertainty stems from potential fluctuations in customer demands over time, supply-side uncertainty is associated with the risk of "disruption" to which facilities may be subject, caused by external factors such as natural disasters, strikes, changes of ownership and information technology security incidents. The proposed model is formulated as a non-linear mixed-integer programming problem that minimises the expected total cost, which comprises four basic items: the fixed cost of locating facilities at candidate sites, the cost of transport from facilities to customers, the cost of working inventory, and the cost of safety stock. Next, since the optimisation problem is very complex and the number of instances that can be evaluated exactly is very low, a "matheuristic" solution approach is presented. This approach has a twofold objective: on the one hand, it considers a larger number of facilities and customers within the network, in order to reproduce a supply chain configuration that more closely reflects a real-world context; on the other, it generates a starting solution and performs a series of iterations to improve it. With this algorithm, it was possible to obtain a solution with a lower total system cost than the initial solution. The study concludes with some reflections and possible directions for future work.
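A sketch of the expected total cost with its four items, using the textbook square-root forms for the two inventory terms; parameter names are mine, and the paper's exact non-linear expressions may differ:

```python
import math

def expected_total_cost(open_fac, assign, fixed_cost, unit_ship, demand,
                        var_demand, hold, z, lead_time):
    """Four cost items: facility fixed costs, transport, working
    inventory and safety stock. assign maps customer i -> facility j."""
    fixed = sum(fixed_cost[j] for j in open_fac)
    ship = sum(unit_ship[(i, j)] * demand[i] for i, j in assign.items())
    working = sum(hold * math.sqrt(sum(demand[i] for i, j in assign.items()
                                       if j == jj))
                  for jj in open_fac)
    safety = sum(hold * z * math.sqrt(lead_time *
                                      sum(var_demand[i]
                                          for i, j in assign.items()
                                          if j == jj))
                 for jj in open_fac)
    return fixed + ship + working + safety

cost = expected_total_cost(
    open_fac=["j1"], assign={"i1": "j1", "i2": "j1"},
    fixed_cost={"j1": 1000.0}, unit_ship={("i1", "j1"): 2.0, ("i2", "j1"): 3.0},
    demand={"i1": 40.0, "i2": 60.0}, var_demand={"i1": 16.0, "i2": 25.0},
    hold=1.5, z=1.96, lead_time=4.0)
print(round(cost, 2))
```

A matheuristic would embed such an evaluation inside a local search that opens and closes facilities and reassigns customers, starting from a constructed initial solution.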

Relevance:

100.00%

Publisher:

Abstract:

Information on fruit and vegetable consumption at the three levels of dietary data in Brazil was analyzed and compared. Data on national supply came from the Food Balance Sheets compiled by the FAO; household availability information was obtained from the Brazilian National Household Budget Survey (HBS); and actual intake information came from a large individual dietary intake survey representative of the adult population of São Paulo city. All sources of information were collected between 2002 and 2003. A subset of the HBS representative of São Paulo city was used in our analysis in order to improve the quality of the comparison with actual intake data. The ratio of national supply to household availability of fruits and vegetables was 2.6, while the ratio of national supply to actual intake was 4.0. The discrepancy ratio between household availability and actual intake was smaller, at 1.6. While the use of supply and availability data has advantages, such as lower cost, it must be borne in mind that these sources tend to overestimate actual intake of fruits and vegetables.
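For illustration, the reported ratios imply quantities in roughly the following proportions (hypothetical values chosen only to reproduce the ratios; the surveys report grams per capita per day):

```python
# Stand-in quantities consistent with the reported discrepancy ratios.
supply, availability, intake = 400.0, 156.0, 100.0

print(round(supply / availability, 1))   # 2.6  (supply vs household)
print(round(supply / intake, 1))         # 4.0  (supply vs actual intake)
print(round(availability / intake, 1))   # 1.6  (household vs actual intake)
```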

Relevance:

100.00%

Publisher:

Abstract:

Mutualistic networks are crucial to the maintenance of ecosystem services. Unfortunately, what we know about seed dispersal networks is based mainly on bird-fruit interactions; we therefore aimed to fill part of this gap by investigating bat-fruit networks. It is known from population studies that (i) some bat species depend more on fruits than others, and (ii) some specialized frugivorous bats prefer particular plant genera. We tested whether those preferences affect the structure and robustness of the whole network and the functional roles of species. Nine bat-fruit datasets from the literature were analyzed: all networks showed lower complementary specialization (H2' = 0.37 ± 0.10, mean ± SD) than pollination networks, and similar nestedness (NODF = 0.56 ± 0.12). All networks were modular (M = 0.32 ± 0.07), with on average four cohesive subgroups (modules) of tightly connected bats and plants. The composition of those modules followed the genus-genus associations observed at the population level (Artibeus-Ficus, Carollia-Piper, and Sturnira-Solanum), although a few of those plant genera were also dispersed by other bats. Bat-fruit networks showed high robustness to simulated cumulative removals of both bats (R = 0.55 ± 0.10) and plants (R = 0.68 ± 0.09). Primary frugivores interacted with a larger proportion of the available plants and occupied more central positions; furthermore, their extinction caused larger changes in network structure. We conclude that bat-fruit networks are highly cohesive and robust mutualistic systems, in which redundancy is high within modules, although modules are complementary to each other. Dietary specialization seems to be an important structuring factor that affects the topology, guild structure and functional roles in bat-fruit networks.
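A minimal sketch of how such a robustness value can be computed from a binary interaction matrix (random removal order here; the paper may use specific orders, e.g. most-to-least connected):

```python
import numpy as np

def robustness(adj, seed=2):
    """R as the mean fraction of the plant guild still connected while
    bats are removed cumulatively at random; this approximates the area
    under the attack-tolerance curve. adj: bats x plants, 1 = interacts."""
    rng = np.random.default_rng(seed)
    adj = adj.copy().astype(float)
    surviving = [1.0]
    for bat in rng.permutation(adj.shape[0]):
        adj[bat, :] = 0.0                               # remove one bat
        surviving.append(float((adj.sum(axis=0) > 0).mean()))
    return float(np.mean(surviving))

toy = np.array([[1, 1, 0, 0],    # e.g. Artibeus-like generalist rows
                [1, 0, 1, 0],
                [0, 1, 1, 1]])
print(round(robustness(toy), 2))
```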

Relevance:

100.00%

Publisher:

Abstract:

Thanks to recent advances in molecular biology, allied to an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem is how to validate such approaches and their results. This work presents an objective approach for the validation of gene network modeling and identification, comprising three main aspects: (1) Artificial Gene Networks (AGNs) generated through theoretical models of complex networks, which are used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, founded on a feature selection approach in which a target gene is fixed and the expression profiles of all other genes are observed in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used to simulate temporal expression data. The results of the network identification method can then be compared with the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks were assessed: the uniformly random Erdos-Renyi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabasi-Albert (BA), and geographical networks (GG). The experimental results indicate that the inference method was sensitive to variation in the average degree k, its network recovery rate decreasing as k increases. The signal size was important for the accuracy of the network identification, with very good results obtained even from small expression profiles. However, the adopted inference method was not able to distinguish different structures of interaction among genes, behaving similarly when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for validating the inferred networks, identifying several properties of the evaluated method, and it can be extended to other inference methods.
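A toy version of the identification step, aspect (2) above: fix a target gene and score subsets of candidate predictors by how well their previous-time expression explains it. The linear-fit R² score is my stand-in for the feature-selection criterion the evaluated method actually used:

```python
import numpy as np
from itertools import combinations

def best_predictors(expr, target, max_set=2):
    """expr: (time x genes) simulated expression matrix. Returns the
    predictor subset (gene indices) whose lagged values best explain
    the target gene, scored by the R^2 of a least-squares linear fit."""
    n_genes = expr.shape[1]
    y = expr[1:, target]
    best, best_score = None, -np.inf
    candidates = [g for g in range(n_genes) if g != target]
    for size in range(1, max_set + 1):
        for subset in combinations(candidates, size):
            X = np.column_stack([expr[:-1, subset], np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            score = 1 - (y - X @ coef).var() / y.var()
            if score > best_score:
                best, best_score = subset, score
    return best, best_score

rng = np.random.default_rng(3)
expr = rng.normal(size=(50, 5))
expr[1:, 0] += 0.8 * expr[:-1, 2]       # plant a true edge: gene 2 -> gene 0
print(best_predictors(expr, target=0))  # best subset should include gene 2
```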

Relevance:

100.00%

Publisher:

Abstract:

Background: The worldwide distribution of surgical interventions is unequal: developed countries account for the majority of surgeries, and information about non-cardiac operations in developing countries is scarce. The purpose of our study was to describe epidemiological data on non-cardiac surgeries performed in Brazil in recent years. Methods and Findings: This is a retrospective cohort study covering the time window from 1995 to 2007. We collected information from DATASUS, a national public health system database. The following variables were studied: number of surgeries, in-hospital expenses, blood transfusion-related costs, length of stay and case fatality rates. The results are presented as sums, averages and percentages, and trends were analysed with a linear regression model. There were 32,659,513 non-cardiac surgeries performed in Brazil in these thirteen years. The number of surgeries increased by 20.42% over the period, and nearly 3 million operations are now performed annually. The cost of these procedures has also risen sharply: the increase in surgical cost was almost 200%. Total expenses related to surgical hospitalizations exceeded $10 billion over these years. The yearly cost of surgical procedures to the public health system was more than $1.27 billion for all surgical hospitalizations, on average US$445.24 per surgical procedure. The total cost of blood transfusion was near $98 million over all years, with approximately $10 million spent annually on perioperative transfusion. Surgical mortality increased by 31.11% over the period; in 2007, surgical mortality in Brazil was 1.77%. All the variables increased significantly over the studied period: r² = 0.447 for the number of surgeries (P = 0.012), r² = 0.439 for in-hospital expenses (P = 0.014) and r² = 0.907 for surgical mortality (P = 0.0055). Conclusion: The volume of surgical procedures has increased substantially in Brazil over the past years, and the expenditure and mortality related to these procedures have increased along with the number of operations. Better planning of public health resources and investment strategies are needed to meet the growing demand for surgery in Brazil.
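The trend analysis amounts to fitting yearly totals with ordinary least squares; a sketch with stand-in numbers (the study used the actual yearly DATASUS totals for 1995-2007):

```python
from scipy.stats import linregress

years = list(range(1995, 2008))
# Stand-in yearly surgery counts in millions; not the study's data.
surgeries = [2.40 + 0.04 * i + 0.05 * ((-1) ** i) for i in range(13)]

fit = linregress(years, surgeries)
print(f"r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4f}")  # trend test
```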

Relevance:

100.00%

Publisher:

Abstract:

This article discusses issues related to the organization and reception of information in the context of technology-driven public information services and systems. It stems from the assumption that in a "technologized" society, the distance between users and information is almost always of a cognitive and socio-cultural nature, a product of our effort to design communication. In this context, we favor the approach of the information sign, seeking to answer how a documentary message turns into information, i.e., a structure recognized as socially useful. Observing the structural, cognitive and communicative aspects of the documentary message, based on Documentary Linguistics and Terminology, as well as on Textual Linguistics, the knowledge management and innovation policy of the Government of the State of São Paulo, which authorizes the use of Web 2.0, is analyzed, questioning to what extent this initiative represents innovation in the library environment.