838 results for end to side anastomosis


Relevance: 100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Publisher:

Abstract:

Aim. We report a case of ulnar and palmar arch artery aneurysm in a 77-year-old man with no history of occupational or recreational trauma, vasculitis, infection or congenital anatomic abnormalities. We also performed a computerized literature search in PubMed using the keywords “ulnar artery aneurysm” and “palmar arch aneurysm”. Case report. A 77-year-old male patient was admitted to hospital with a pulsating mass over the distal right ulnar artery and deep palmar arch; ultrasound and CT examination detected a 35 mm saccular aneurysm of the right ulnar artery and a 15 mm dilatation of the deep palmar arch. He was asymptomatic, with no distal embolization or pain. Under local anesthesia, the ulnar artery and deep palmar arch dilatations were resected and the vessels were reconstructed with an end-to-end microvascular repair. Histological examination confirmed the absence of vasculitis and collagenopathies. In the postoperative period there were no clinical signs of peripheral ischemia, and Allen's test and ultrasound examination were normal. At six-month follow-up the patient remained asymptomatic, with a normal Allen test, no signs of distal digital ischemia, and a patent treated vessel with normal flow on duplex ultrasound. Conclusion. True spontaneous aneurysms of the ulnar artery and palmar arch are rare and can be successfully treated with resection and microvascular reconstruction.

Relevance: 100.00%

Publisher:

Abstract:

This work describes the aspects considered most relevant of the life and work of the extraordinary nineteenth-century scientist and humanist Louis Pasteur. His work is also placed, at several points, in the context of that of other scientists of the period, framing his investigations within earlier discoveries. It covers his remarkable studies in crystallography (which contributed greatly to modern stereochemistry) and in fermentation, understood as a mechanism used by certain microorganisms to produce energy in the absence of oxygen, a fact entirely unprecedented at the time. It explains how Pasteur, with great ingenuity, managed to put an end to the old theory of spontaneous generation. It recounts how the brilliant idea of pasteurization arose, a term coined in homage to the great scholar, which went on to transform the wine and beer industries and those of many other foods, establishing the importance of microbiology in the food industry. Finally, it reviews Pasteur's studies of infectious diseases (pébrine, fowl cholera, anthrax and rabies), including the spectacular procedures that led to the first vaccines, which taught younger scientists to develop others and saved countless lives.

Relevance: 100.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, and so on. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
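As a rough illustration of the neighborhood-centric model sketched in this abstract, the Python fragment below extracts k-hop ego networks with networkx and runs a user-supplied analysis on each subgraph rather than on single vertices. This is a minimal single-machine sketch only; NSCALE's actual API, declarative subgraph specification, and distributed execution engine are not reproduced here, and the function name `neighborhood_centric` is hypothetical.

```python
# Minimal sketch of neighborhood-centric analysis (ego networks), assuming
# networkx is available; NSCALE itself distributes this work, which is not
# reproduced here.
import networkx as nx

def neighborhood_centric(graph, radius, analyze):
    """Run `analyze` on the k-hop ego network of every vertex."""
    results = {}
    for v in graph.nodes():
        ego = nx.ego_graph(graph, v, radius=radius)  # induced k-hop subgraph
        results[v] = analyze(ego)                    # user program sees a subgraph
    return results

if __name__ == "__main__":
    G = nx.karate_club_graph()
    # Example analysis: edge density measured over each 2-hop neighborhood.
    density_by_vertex = neighborhood_centric(G, radius=2, analyze=nx.density)
    print(sorted(density_by_vertex.items())[:5])
```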

Relevance: 100.00%

Publisher:

Abstract:

We report the case of a 24-year-old woman who underwent cesarean delivery at 39 weeks and presented in the puerperium with worsening abdominal pain and septicaemia. Preoperative ultrasonography suggested the presence of a pelvic collection. Exploratory laparotomy revealed the simultaneous presence of Meckel's diverticulitis and appendicitis without bowel perforation. The patient made an uneventful recovery following small bowel resection with end-to-end reanastomosis and appendicectomy.

Relevance: 100.00%

Publisher:

Abstract:

While news stories are an important traditional medium to broadcast and consume news, microblogging has recently emerged as a place where people can discuss, disseminate, collect or report information about news. However, the massive amount of information in the microblogosphere makes it hard for readers to keep up with these real-time updates. This is especially a problem when it comes to breaking news, where people are more eager to know “what is happening”. Therefore, this dissertation is intended as an exploratory effort to investigate computational methods to augment human effort when monitoring the development of breaking news on a given topic from a microblog stream, by extractively summarizing the updates in a timely manner. More specifically, given an interest in a topic, either entered as a query or presented as an initial news report, a microblog temporal summarization system is proposed to filter microblog posts from a stream with three primary concerns: topical relevance, novelty, and salience. Considering the relatively high arrival rate of microblog streams, a cascade framework consisting of three stages is proposed to progressively reduce the quantity of posts. For each step in the cascade, this dissertation studies methods that improve over current baselines. In the relevance filtering stage, query and document expansion techniques are applied to mitigate sparsity and vocabulary mismatch issues. The use of word embeddings as a basis for filtering is also explored, using unsupervised and supervised modeling to characterize lexical and semantic similarity. In the novelty filtering stage, several statistical ways of characterizing novelty are investigated and ensemble learning techniques are used to integrate results from these diverse techniques. These results are compared with a baseline clustering approach using both standard and delay-discounted measures. In the salience filtering stage, because of the real-time prediction requirement, a method of learning verb phrase usage from past relevant news reports is used in conjunction with some standard measures for characterizing writing quality. Following a Cranfield-like evaluation paradigm, this dissertation includes a series of experiments to evaluate the proposed methods for each step, and for the end-to-end system. New microblog novelty and salience judgments are created, building on existing relevance judgments from the TREC Microblog track. The results point to future research directions at the intersection of social media, computational journalism, information retrieval, automatic summarization, and machine learning.
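As a hedged sketch of the cascade described above, the fragment below filters a post stream in three stages (relevance, novelty, salience). Plain word-overlap (Jaccard) similarity and a length check stand in for the embedding, ensemble, and verb-phrase models studied in the dissertation, and every threshold is an illustrative assumption rather than a tuned value.

```python
# Minimal sketch of a relevance -> novelty -> salience cascade over a post
# stream. Jaccard overlap stands in for the embedding/ensemble models; the
# thresholds are illustrative, not tuned values.

def tokens(text):
    return set(text.lower().split())

def cascade_filter(posts, query, selected=None,
                   rel_min=0.1, nov_max=0.6, sal_min=5):
    selected = list(selected or [])
    query_toks = tokens(query)
    for post in posts:
        toks = tokens(post)
        # Stage 1: topical relevance to the query.
        rel = len(toks & query_toks) / max(len(toks | query_toks), 1)
        if rel < rel_min:
            continue
        # Stage 2: novelty against everything already selected.
        novelty_ok = all(
            len(toks & tokens(s)) / max(len(toks | tokens(s)), 1) < nov_max
            for s in selected
        )
        if not novelty_ok:
            continue
        # Stage 3: a crude salience proxy (enough content words).
        if len(toks) < sal_min:
            continue
        selected.append(post)
    return selected

updates = cascade_filter(
    ["earthquake hits the coast and buildings are damaged",
     "earthquake hits coast",
     "my cat is cute"],
    query="earthquake coast damage")
print(updates)
```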

Relevance: 100.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case for software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
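A minimal sketch of the core intersection idea follows, assuming execution traces are already available as lists of statement identifiers: the longest common subsequence of the failing traces approximates the shared (possibly faulty) path. The dissertation's sequence-cover optimizations and hybrid static/dynamic back-tracing are not shown, and the trace data below is invented for illustration.

```python
# Minimal sketch: narrow the search for a faulty path by taking the longest
# common subsequence (LCS) of the execution traces of failing test cases.
# Traces are assumed to be lists of statement/branch identifiers.

def lcs(a, b):
    """Classic dynamic-programming LCS of two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    # Reconstruct one LCS by walking the table backwards.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

def common_faulty_subsequence(failing_traces):
    """Fold LCS over all failing traces to get their shared subsequence."""
    shared = failing_traces[0]
    for trace in failing_traces[1:]:
        shared = lcs(shared, trace)
    return shared

traces = [["init", "parse", "lookup", "write", "close"],
          ["init", "lookup", "write", "flush", "close"],
          ["init", "parse", "lookup", "retry", "write", "close"]]
print(common_faulty_subsequence(traces))  # candidate faulty path
```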

Relevance: 100.00%

Publisher:

Abstract:

Magnetically-induced forces on the inertial masses on board LISA Pathfinder are expected to be one of the dominant contributions to the mission noise budget, accounting for up to 40%. The origin of this disturbance is the coupling of the residual magnetization and susceptibility of the test masses with the environmental magnetic field. In order to fully understand this important part of the noise model, a set of coils and magnetometers is integrated as part of the diagnostics subsystem. During operations, a sequence of magnetic excitations will be applied to precisely determine the coupling of the magnetic environment to the test mass displacement using the on-board magnetometers. Since no direct measurement of the magnetic field at the test mass position will be available, an extrapolation of the magnetic measurements to the test mass position will be carried out as part of the data analysis activities. In this paper we show the first results of the magnetic experiments during an end-to-end LISA Pathfinder simulation, and we describe the methods under development to map the magnetic field on board.
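One way such an extrapolation could be sketched, under the simplifying assumption of a first-order field model B(r) = B0 + G r fitted to the magnetometer readings by least squares, is shown below. The sensor positions and readings are invented placeholders, not LISA Pathfinder data, and the actual on-board layout and LTP data-analysis pipeline are not reproduced.

```python
# Minimal sketch, assuming a first-order field model B(r) = B0 + G @ r fitted
# to magnetometer readings by least squares; positions and readings below are
# made-up numbers, not LISA Pathfinder data.
import numpy as np

def fit_linear_field(positions, readings):
    """Fit B(r) = B0 + G r (per component) from magnetometer samples."""
    ones = np.ones((len(positions), 1))
    A = np.hstack([ones, positions])            # design matrix [1, x, y, z]
    coeffs, *_ = np.linalg.lstsq(A, readings, rcond=None)
    B0, G = coeffs[0], coeffs[1:].T             # rows of G give the gradient
    return B0, G

def extrapolate(B0, G, r):
    return B0 + G @ r

# Four magnetometer positions (m) and their field readings (arbitrary units).
pos = np.array([[0.2, 0.0, 0.0], [-0.2, 0.0, 0.0],
                [0.0, 0.2, 0.0], [0.0, 0.0, 0.2]])
B = np.array([[1.0, 0.1, 0.0], [0.8, 0.1, 0.0],
              [0.9, 0.2, 0.0], [0.9, 0.1, 0.1]])
B0, G = fit_linear_field(pos, B)
print(extrapolate(B0, G, np.array([0.05, 0.05, 0.0])))  # field at test mass
```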

Relevance: 100.00%

Publisher:

Abstract:

Thermal Diagnostics experiments to be carried out on board LISA Pathfinder (LPF) will yield a detailed characterisation of how temperature fluctuations affect the performance of the LTP (LISA Technology Package) instrument, crucial information for future space-based gravitational wave detectors such as the proposed eLISA. Among them, the study of temperature gradient fluctuations around the test masses of the Inertial Sensors will also provide information regarding the contribution of Brownian noise, which is expected to limit the LTP sensitivity at frequencies close to 1 mHz during some LTP experiments. In this paper we report on how this kind of Thermal Diagnostics experiment was simulated in the last LPF Simulation Campaign (November 2013), which involved the whole LPF Data Analysis team and used an end-to-end simulator of the whole spacecraft. This simulation campaign was conducted within the framework of the preparation for LPF operations.

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this paper is to survey and assess the state of the art in automatic target recognition for synthetic aperture radar imagery (SAR-ATR). The aim is not to develop an exhaustive survey of the voluminous literature, but rather to capture in one place the various approaches to implementing a SAR-ATR system. This paper is meant to be as self-contained as possible, and it approaches the SAR-ATR problem from a holistic, end-to-end perspective. A brief overview of the breadth of the SAR-ATR challenges is given. The discussion is couched in terms of single-channel SAR, and it is extendable to multi-channel SAR systems. The stages pertinent to the basic SAR-ATR system structure are defined, and the motivations for the requirements and constraints on the system constituents are addressed. For each stage in the SAR-ATR processing chain, a taxonomization methodology for surveying the numerous methods published in the open literature is proposed. Carefully selected works from the literature are presented under the proposed taxa. Novel comparisons, discussions, and comments are pinpointed throughout this paper. A two-fold benchmarking scheme for evaluating existing SAR-ATR systems and motivating new system designs is proposed, and the scheme is applied to the works surveyed in this paper. Finally, a discussion is presented in which various interrelated issues, such as standard operating conditions, extended operating conditions, and target-model design, are addressed. This paper is a contribution toward fulfilling an objective of end-to-end SAR-ATR system design.

Relevance: 100.00%

Publisher:

Abstract:

The last couple of decades have seen the introduction of new telecommunication networks. It is expected that in the future all types of vehicles, such as cars, buses and trucks, will be able to intercommunicate and form a vehicular network. Vehicular networks display particularities when compared to other networks due to their continuous node mobility and their wide geographical dispersion, leading to permanent network fragmentation. Therefore, the main challenges that this type of network entails relate to intermittent connectivity and the long and variable delay in information delivery. To address the problems related to intermittent connectivity, a new concept was introduced: the Delay Tolerant Network (DTN). This architecture is built on a Store-Carry-and-Forward (SCF) mechanism in order to assure the delivery of information when no end-to-end path is defined. Vehicular networks support a multiplicity of services, including the transport of non-urgent information. It is therefore possible to conclude that the use of a DTN for the dissemination of non-urgent information can overcome the aforementioned challenges. The work developed here focuses on the use of DTNs for the dissemination of non-urgent information. This information originates with the network service provider and should be available on mobile network terminals during a limited period of time. To this end, four different strategies were deployed: Random, Least Number of Hops First (LNHF), Local Rarest Bundle First (LRBF) and Local Rarest Generation First (LRGF). All of these strategies have a common goal: to disseminate content through the network in the shortest period of time while minimizing network congestion. This work also contemplates the analysis and implementation of techniques that reduce network congestion. The design, implementation and validation of the proposed strategies were divided into three stages. The first stage focused on creating a Matlab emulator for fast implementation and strategy validation. This stage resulted in the four strategies that were afterwards implemented in the DTN software Helix, developed in a partnership between Instituto de Telecomunicações (IT) and Veniam, which are responsible for the largest operating vehicular network worldwide, located in the city of Porto. The strategies were later evaluated on an emulator built for the large-scale testing of DTNs. Both emulators account for vehicular mobility based on information previously collected from the real platform. Finally, the strategy that presented the best overall performance was tested on a real platform, in a lab environment, for a demonstration of concept and operability. It is possible to conclude that two of the implemented strategies (LRBF and LRGF) can be deployed in the real network and guarantee a significant delivery rate. The LRBF strategy has the best performance in terms of delivery; however, it needs to add significant overhead to the network in order to work. In the future, scalability tests should be conducted in a real environment in order to confirm the emulator results. The real implementation of the strategies should be accompanied by the introduction of new types of services for content distribution.
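As a toy, hedged sketch of the Store-Carry-and-Forward idea with a Local Rarest Bundle First style choice at contact time: when two nodes meet, each forwards first the bundle it has seen least often among its contacts. The class and the rarity bookkeeping below are illustrative assumptions and do not reproduce the Helix implementation or the exact LRBF algorithm.

```python
# Toy sketch of Store-Carry-and-Forward with a "Local Rarest Bundle First"
# style choice. Illustration only; the real Helix/LRBF implementation keeps
# richer state than shown here.

class Node:
    def __init__(self, name):
        self.name = name
        self.buffer = set()          # bundles stored (carried) by this node
        self.seen_counts = {}        # bundle id -> how often seen at contacts

    def note_contact(self, other):
        for bundle in other.buffer:
            self.seen_counts[bundle] = self.seen_counts.get(bundle, 0) + 1

    def rarest_missing_from(self, other):
        candidates = self.buffer - other.buffer
        if not candidates:
            return None
        # Forward the locally rarest bundle first.
        return min(candidates, key=lambda b: self.seen_counts.get(b, 0))

def contact(a, b):
    a.note_contact(b); b.note_contact(a)
    for sender, receiver in ((a, b), (b, a)):
        bundle = sender.rarest_missing_from(receiver)
        if bundle is not None:
            receiver.buffer.add(bundle)   # store: carried until next contact

bus, car = Node("bus"), Node("car")
bus.buffer = {"traffic-map", "firmware-v2"}
contact(bus, car)
print(car.buffer)   # the car now carries one bundle for later forwarding
```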

Relevance: 100.00%

Publisher:

Abstract:

Introduction: A persistent infection with human papillomaviruses (HPV) is a necessary precondition for the development of cervical cancer. HPV types 16 and 18 cause the majority of cervical cancers, about 70%. Two vaccines against HPV 16 and 18 have been available since 2006/2007. Research questions: How effective is HPV vaccination in reducing cervical cancer and its precursor lesions (CIN)? Is HPV vaccination a cost-effective addition to current screening practice? Are there differences in cost-effectiveness between the two available vaccines? Should the use of HPV vaccination be recommended from a health-economic perspective? If so, what recommendations can be derived for the design of a vaccination strategy? Which ethical, social and legal implications have to be considered? Methods: Based on a systematic literature search, randomized controlled trials on the efficacy of HPV vaccination for the prevention of cervical cancer and its precursors, the cervical intraepithelial neoplasias, are identified. Health-economic modelling studies are used to answer the economic questions. The quality of the medical and economic studies is assessed using recognized standards for the systematic appraisal of scientific studies. Results: In women who were HPV 16/18 negative at baseline and received all vaccine doses, the efficacy of vaccination against HPV 16/18-induced CIN 2 or higher is 98% to 100%. Adverse effects of the vaccination are mainly injection-related complaints (redness, swelling, pain). There are no significant differences in serious adverse events between the vaccine and placebo groups. When only direct cost components are considered, the base-case results of the health-economic models range from about 3,000 to about 40,000 euros per QALY (quality-adjusted life year), and from about 9,000 to about 65,000 euros per LYG (life year gained). Discussion: According to the results of the included trials, the available HPV vaccines are effective in preventing premalignant cervical lesions caused by HPV 16/18. The duration of vaccine protection is still unclear. With regard to adverse effects, the vaccination can be considered safe; however, the sample sizes of the trials are not large enough to reliably determine the occurrence of very rare adverse events. The extent to which HPV vaccination will reduce the incidence and mortality of cervical cancer in Germany depends not only on the clinical efficacy of the vaccines, but is determined by a number of further factors, such as vaccination coverage and the effect of vaccination on participation rates in existing screening programmes. Owing to the heterogeneity of the methodological frameworks and input parameters, the results of the health-economic models vary considerably. Nevertheless, nearly all model analyses support the conclusion that introducing a vaccination with lifelong protection can be considered cost-effective if current screening practice is continued. A comparison of the two vaccines showed that, when QALYs are used as the outcome parameter, modelling of the quadrivalent vaccine usually yields a lower (better) cost-effectiveness ratio than modelling of the bivalent vaccine, because genital warts are also taken into account. In sensitivity analyses, both the duration of protection and the discount rate emerged as key determinants of cost-effectiveness. Conclusion: The introduction of HPV vaccination can lead to a reduced incidence of cervical cancer in vaccinated women. However, vaccination programmes should be accompanied by further evaluation in order to assess long-term efficacy and safety and to optimize their implementation. High participation rates, both in the vaccination programmes and, also among vaccinated women, in the screening examinations, are of central importance. Because cost-effectiveness depends decisively on the duration of protection, which is so far uncertain, a final assessment of the cost-effectiveness of HPV vaccination is not possible. A long duration of protection is an important prerequisite for the cost-effectiveness of the vaccination. Concluding a risk-sharing agreement between payers and manufacturers is one option for limiting the impact of this uncertainty on cost-effectiveness.
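The figures per QALY and per LYG quoted above are incremental cost-effectiveness ratios. Purely as a reminder of how such a ratio is formed (this is the standard textbook definition, not a reproduction of any of the reviewed models), for vaccination plus screening compared with screening alone:

```latex
% Incremental cost-effectiveness ratio (ICER); C = discounted costs,
% E = discounted effects (QALYs or life years gained).
\mathrm{ICER}
  = \frac{C_{\text{vaccination+screening}} - C_{\text{screening}}}
         {E_{\text{vaccination+screening}} - E_{\text{screening}}}
```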

Relevance: 100.00%

Publisher:

Abstract:

In this bachelor's final project, an administration application has been developed to replace the one provided by default in applications built with the Django web development framework. The application consists of two parts: a server, developed with Node and Express, that queries the MySQL database of the Django application (acting as the link between the two) and exposes an API consumed by the other part of the application, the client. The API is entirely private: a valid authentication token is required to obtain a successful response from it, and generating that token is also the server's responsibility. The client, which is the part the end user sees, is built with the Angular framework. The user interface uses Bootstrap, so it renders correctly on any type of device, whether desktop or mobile. In short, an end-to-end JavaScript application has been developed using the latest web technologies, noticeably improving on the features offered by the administration panel generated automatically by a Django application.

Relevance: 100.00%

Publisher:

Abstract:

The study in this report focuses on an analysis of the role of the tourist in the conservation of natural and cultural heritage, in particular of the Tijuca National Park in Rio de Janeiro. The goals set for this work are to elucidate the concept of cultural tourism in its interrelations with ecotourism; to understand the state of the art concerning the involvement of the third sector (NGOs, associations, foundations) and of individual tourists in heritage conservation; to characterize the Tijuca National Park; to characterize the Friends of the Park; and finally to inquire about the willingness of tourists to join the Association of Friends of the Tijuca National Park. For this purpose a survey was designed and administered by the intern to tourists visiting the park, providing the data that formed the basis for the conclusions of this work and leading, at the end, to the proposal of some strategies and policies to attract new members, particularly tourists. The results lead to the conclusion that, as a rule, tourists are not willing to contribute to conservation or to join the Friends of the Park. The percentage of tourists who stated they were willing to join the association was only 5%, mostly tourists aged between 25 and 44, with higher education and from South American countries. Tourists were also asked about their willingness to make a one-off donation, for which the rate of positive responses was higher, at 22%, with contributions also from European tourists. Despite these figures, the number of annual visitors to the Tijuca National Park increases every year (and it is expected to continue to increase); in 2014 it received 3,086,207 visitors, of whom about 72% were tourists. It is important to gain the support of these tourists through membership and donations, since this is relevant not only financially, but also for the internationalization and recognition of the association and the park, which will lead to a larger network of associates. Keywords: ecotourism; third sector; conservation; Association of Friends of the Tijuca National Park

Relevance: 100.00%

Publisher:

Abstract:

The liver is one of the most important organs of the human body, being involved in several vital functions and in the regulation of physiological processes. Given its pivotal role in the excretion of waste metabolites and in drug detoxification, the liver is often subjected to oxidative stress that leads to lipid peroxidation and severe cellular damage. The conventional treatments of liver diseases such as cirrhosis, fatty liver and chronic hepatitis are frequently inadequate due to the side effects caused by hepatotoxic chemical drugs. To overcome this paradox, medicinal plants, owing to their natural richness in phenolic compounds, have been intensively investigated, with the composition of their extracts and fractions examined in order to find bioactive compounds that could be isolated and applied in the treatment of liver ailments. The present review aimed to collect the main results of recent studies carried out in this field and to systematize the information for a better understanding of the hepatoprotective capacity of medicinal plants in in vitro and in vivo systems. Generally, the assessed plant extracts revealed good hepatoprotective properties, justifying the fractionation and further isolation of phenolic compounds from different parts of the plant. Twenty-five phenolic compounds, including flavonoids, lignans, phenolic acids and other phenolics, have been isolated and identified, and proved to be effective in the prevention and/or treatment of chemically induced liver damage. In this perspective, the use of medicinal plant extracts, fractions and phenolic compounds seems to be a promising strategy to avoid the side effects caused by hepatotoxic chemicals.