923 results for CRITICAL ORGANS


Relevance:

20.00%

Publisher:

Abstract:

Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems: the more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on the system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process for critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases.
This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
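Event-B models are written in Event-B's own mathematical notation and verified with the Rodin tool, so they cannot be shown directly as executable code. As a rough, hypothetical sketch of the guard/action/invariant structure that abstraction and stepwise refinement preserve, consider the following Python analogue (the class, event names, and safety limit are all illustrative, not taken from the thesis):

```python
# Illustrative sketch only: mimics the shape of an Event-B machine, where
# each event has a guard (WHEN) and an action (THEN), and every event must
# preserve the machine's safety invariant. All names are hypothetical.

class SafetyModel:
    """Toy machine: a controller whose level must never exceed a safe limit."""

    SAFE_LIMIT = 10  # hypothetical safety requirement: level <= SAFE_LIMIT

    def __init__(self):
        self.level = 0

    def invariant(self):
        # Safety invariant, analogous to an Event-B INVARIANTS clause.
        return 0 <= self.level <= self.SAFE_LIMIT

    def increase(self):
        if self.level < self.SAFE_LIMIT:  # guard
            self.level += 1               # action

    def decrease(self):
        if self.level > 0:
            self.level -= 1

m = SafetyModel()
for _ in range(20):       # fire the event more times than the limit allows
    m.increase()
    assert m.invariant()  # the guard keeps the invariant true at every step
print(m.level)  # prints 10: the guard stops the level at SAFE_LIMIT
```

In Event-B proper, the obligation that every event preserves the invariant is discharged as a mathematical proof by the Rodin provers, rather than checked at runtime as here.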

Relevance:

20.00%

Publisher:

Abstract:

The autonomic nervous system plays an important role in physiological and pathological conditions and has been extensively evaluated by parametric and non-parametric spectral analysis. To compare the results obtained with the fast Fourier transform (FFT) and the autoregressive (AR) method, we performed a comprehensive comparative study using data from humans and rats during pharmacological blockade (in rats), a postural test (in humans), and in the hypertensive state (in both humans and rats). Although postural hypotension in humans induced an increase in the normalized low-frequency power (LFnu) of systolic blood pressure, the increase in the LF/HF ratio was detected only by AR. In rats, AR and FFT analysis did not agree for LFnu and normalized high-frequency power (HFnu) under basal conditions and after vagal blockade. The increase in the LF/HF ratio of the pulse interval, induced by methylatropine, was detected only by FFT. In hypertensive patients, changes in LF and HF for systolic blood pressure were observed only by AR; FFT was able to detect the reduction in both blood pressure variance and total power. In hypertensive rats, AR presented different values of variance and total power for systolic blood pressure. Moreover, AR and FFT presented discordant results for LF, LFnu, HF, the LF/HF ratio, and total power for the pulse interval. We provide evidence of disagreement in 23% of the indices of blood pressure and heart rate variability in humans and 67% discordance in rats when these variables are evaluated by AR and FFT under physiological and pathological conditions. The overall disagreement between AR and FFT in this study was 43%.
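The LF and HF indices compared in the study come from two spectral estimators: a nonparametric FFT periodogram and a parametric autoregressive (AR) fit. A minimal sketch of both on a synthetic signal follows; the sampling rate, band limits, and AR order are illustrative choices, not those of the study:

```python
import numpy as np

# Synthetic "tachogram"-like signal: an LF oscillation (0.1 Hz) plus a
# weaker HF oscillation (0.3 Hz) and noise.
fs = 4.0
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * np.sin(2 * np.pi * 0.3 * t)
x = x + 0.1 * rng.standard_normal(t.size)
x = x - x.mean()

def band_power(freqs, psd, lo, hi):
    """Integrate a PSD over [lo, hi) with a simple rectangle rule."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

# Nonparametric estimate: FFT periodogram.
freqs = np.fft.rfftfreq(x.size, 1 / fs)
psd_fft = np.abs(np.fft.rfft(x)) ** 2 / (fs * x.size)

# Parametric estimate: AR model fitted via the Yule-Walker equations.
def ar_psd(sig, order, freqs, fs):
    r = np.correlate(sig, sig, mode="full")[sig.size - 1:] / sig.size
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])          # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]              # driving-noise variance
    k = np.arange(1, order + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs / fs, k)) @ a) ** 2
    return sigma2 / (fs * denom)

psd_ar = ar_psd(x, 16, freqs, fs)

# LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) band powers, human convention.
lf_fft = band_power(freqs, psd_fft, 0.04, 0.15)
hf_fft = band_power(freqs, psd_fft, 0.15, 0.40)
lf_ar = band_power(freqs, psd_ar, 0.04, 0.15)
hf_ar = band_power(freqs, psd_ar, 0.15, 0.40)
print(lf_fft / hf_fft, lf_ar / hf_ar)  # the two estimators need not agree
```

As the study reports, the two methods can yield different LF/HF ratios for the same recording, since the periodogram is data-driven per frequency bin while the AR spectrum is constrained by the fitted model order.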

Relevance:

20.00%

Publisher:

Abstract:

Ischemic preconditioning (IPC), a strategy used to attenuate ischemia-reperfusion injury, consists of brief ischemic periods, each followed by reperfusion, prior to a sustained ischemic insult. The purpose of the present study was to evaluate the local and systemic anti-inflammatory effects of hind limb IPC in male Wistar rat (200-250 g) models of acute inflammation. IPC was induced by 10 min of right hind limb ischemia, achieved by placing an elastic rubber band tourniquet on the proximal part of the limb, followed by 30 min of reperfusion. Groups (N = 6-8) were submitted to right or left paw edema (PE) with carrageenan (100 µg) or dextran (200 µg), hemorrhagic cystitis with ifosfamide (200 mg/kg, ip), or gastric injury (GI) with indomethacin (20 mg/kg, vo). Controls received similar treatments, without IPC (Sham-IPC). PE is reported as the variation in paw volume (mL), vesical edema (VE) as vesical wet weight (mg), vascular permeability (VP) as Evans blue extravasation (µg), GI as the gastric lesion index (GLI; total length of all erosions, mm), and neutrophil migration (NM) as myeloperoxidase activity. Statistical significance (P < 0.05) was determined by ANOVA followed by the Tukey test. Carrageenan- or dextran-induced PE and VP in either paw were reduced by IPC (42-58.7%). IPC inhibited VE (38.8%) and VP (54%) in ifosfamide-induced hemorrhagic cystitis. GI and NM induced by indomethacin were inhibited by IPC (GLI: 90.3%; NM: 64%). This study shows for the first time that IPC produces local and systemic anti-inflammatory effects in models of acute inflammation other than ischemia-reperfusion injury.

Relevance:

20.00%

Publisher:

Abstract:

The single photon emission microscope (SPEM) is an instrument developed to obtain high spatial resolution single photon emission computed tomography (SPECT) images of small structures inside the mouse brain. SPEM consists of two independent imaging devices, each of which combines a multipinhole collimator, a high-resolution, thallium-doped cesium iodide [CsI(Tl)] columnar scintillator, a demagnifying/intensifier tube, and an electron-multiplying charge-coupled device (CCD). The collimators have 300- and 450-µm diameter pinholes on tungsten slabs, in hexagonal arrays of 19 and 7 holes. Projection data are acquired with a photon-counting strategy, in which CCD frames are stored at 50 frames per second, with a radius of rotation of 35 mm and a magnification factor of one. The image reconstruction software tool is based on the maximum likelihood algorithm. Our aim was to evaluate the spatial resolution and sensitivity attainable with the seven-pinhole imaging device, together with the linearity of quantification on the tomographic images, and to test the instrument by obtaining tomographic images of different mouse organs. A spatial resolution better than 500 µm and a sensitivity of 21.6 counts·s⁻¹·MBq⁻¹ were reached, as well as a correlation coefficient between activity and intensity better than 0.99, when imaging 99mTc sources. Images of the thyroid, heart, lungs, and bones of mice were registered using 99mTc-labeled radiopharmaceuticals in times appropriate for routine preclinical experimentation (<1 h per projection data set). Detailed experimental protocols and images of the aforementioned organs are shown. We plan to extend the instrument's field of view to fit larger animals and to combine data from both detectors to reduce the acquisition time or the applied activity.
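The abstract states that image reconstruction is based on the maximum likelihood algorithm; a common concrete choice in emission tomography is MLEM (maximum-likelihood expectation maximization), whose multiplicative update is sketched below on a toy system matrix. The actual SPEM reconstruction code and pinhole geometry are not described here, so this is only an illustration of the algorithm family:

```python
import numpy as np

# Toy system matrix A (detector bins x image voxels): A[i, j] models the
# probability that a photon emitted in voxel j is counted in detector bin i.
rng = np.random.default_rng(1)
A = rng.random((8, 4))
A /= A.sum(axis=0)            # normalize columns (toy geometry, not SPEM's)

x_true = np.array([4.0, 0.5, 2.0, 1.0])   # "true" activity per voxel
y = A @ x_true                             # noiseless projection data

# MLEM update: x <- x * [A^T (y / (A x))] / (A^T 1)
x = np.ones(A.shape[1])                    # uniform, strictly positive start
sens = A.sum(axis=0)                       # sensitivity image, A^T 1
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / sens

print(np.round(x, 2))  # converges toward x_true on noiseless data
```

The multiplicative form keeps every voxel estimate non-negative, which is one reason ML-based methods suit low-count pinhole SPECT data.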

Relevance:

20.00%

Publisher:

Abstract:

Negotiating trade agreements is an important part of government trade policy, economic planning, and today's globally operating trading system. In global comparison, the European Union and the United States have been active in the formation of trade agreements. Now these two economic giants are engaged in negotiations to form their own trade agreement, the so-called Transatlantic Trade and Investment Partnership (TTIP). The purpose of this thesis is to understand the reasons for making a trade agreement between two economic areas and the issues it may involve in the case of the TTIP. The TTIP has received a great deal of attention in the media; opinions towards the partnership have been extreme, and the debate has been heated. A further purpose of this study is to describe the nature of the public discussion regarding the TTIP from spring 2013 until 2014. The research problem is to find out what the main issues in the agreement are and what values influence them. The study was conducted by applying methods of critical discourse analysis to the chosen data. This included gathering the issues from the data based on the attention each has received in the discussion. The underlying motives for raising different issues were analysed by investigating the authors' positions in political, economic and social circles. The perceived economic impacts of the TTIP are also analysed with the same criteria. Some of the most respected economic newspapers globally were included in the research material, as well as papers and reports published by the EU and global organisations. The analysis indicates a clear dichotomy in attitudes towards the TTIP. Key problems include the lack of transparency in the negotiations, the misunderstood investor-state dispute settlement, the constantly expanding regulatory issues, and the risk of protectionism.
The theory and data do suggest that the removal of tariffs is an effective tool for reaching economic gains in the TTIP, and that reducing non-tariff barriers, such as protectionism, would be even more effective. Critics are worried about the rising influence of corporations over governments. The discourse analysis reveals that supporters of the TTIP hold values related to increasing welfare through economic growth. Critics do not deny the economic benefits but raise the question of inequality as a consequence. Overall, they represent softer values, such as sustainable development and democracy, as a counterweight to the corporate values of efficiency and profit maximisation.

Relevance:

20.00%

Publisher:

Abstract:

Sleep is important for the recovery of a critically ill patient, as lack of sleep is known to negatively influence a person's cardiovascular system, mood, orientation, and metabolic and immune function, and thus it may prolong patients' intensive care unit (ICU) and hospital stays. Intubated and mechanically ventilated patients suffer from fragmented and light sleep. However, it is not well known how non-intubated patients sleep. The evaluation of patients' sleep may be compromised by their fatigue and still posture, which give no indication of whether they are asleep or not. The purpose of this study was to evaluate ICU patients' sleep evaluation methods, the quality of non-intubated patients' sleep, and the sleep evaluations performed by ICU nurses. The aims were to develop recommendations on patients' sleep evaluation for ICU nurses and to provide a description of the quality of non-intubated patients' sleep. The literature review of ICU patients' sleep evaluation methods extended to the end of 2014. The evaluation of the quality of patients' sleep was conducted with four data sets: A) the nurses' narrative documentation of the quality of patients' sleep (n=114), B) the nurses' sleep evaluations (n=21) with a structured observation instrument, C) the patients' self-evaluations (n=114) with the Richards-Campbell Sleep Questionnaire, and D) polysomnographic evaluations of the quality of patients' sleep (n=21). The correspondence of data A with data C (collected 4–8/2011), and of data B with data D (collected 5–8/2009), was analysed. Content analysis was used for the nurses' documentation and statistical analyses for all the other data. The quality of non-intubated patients' sleep varied between individuals. In many patients, sleep was light, awakenings were frequent, and the amount of sleep was insufficient compared to sleep in healthy people. However, some patients were able to sleep well. On average, the patients evaluated the quality of their sleep as neither high nor low.
On a scale from 0 (poor sleep) to 100 (good sleep), sleep depth was evaluated as the worst and the speed of falling asleep as the best aspect of sleep. Nursing care was mostly performed while the patients were awake, and thus its disturbing effect was low. The instruments available for nurses to evaluate the quality of patients' sleep were limited and measured mainly the quantity of sleep. Nurses' structured observational evaluations of the quality of patients' sleep were correct for approximately two thirds of the cases, and only regarding total sleep time. Nurses' narrative documentation of the patients' sleep corresponded with the patients' self-evaluations in just over half of the cases. However, nurses documented several dimensions of sleep that are not included in the present sleep evaluation instruments. These could be classified according to the components of the nursing process: needs assessment, sleep assessment, intervention, and effect of intervention. Valid, more comprehensive sleep evaluation methods are needed for nurses to evaluate, document, improve and study patients' quality of sleep.

Relevance:

20.00%

Publisher:

Abstract:

Emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interplay and work towards shared goals and objectives. In this paper, I will draw a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, and in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, the genesis and consolidation period of these literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The 'deluge' of popular literature in the 1920s to 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language.
This was the beginning of a renaissance and a period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation's Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists to be raw data for linguists. Our data is used for creating morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø, for instance. In order to reach these targets, we will produce not only the digitized materials but also development tools to support linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology. The mission is to improve the usage and usability of digitized content. During the project, we have advanced methods that will refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to be beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages out of a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded text (OCR) often contains too many mistakes to be used as such in research, so the mistakes in OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. It was necessary to implement this tool, since these rare and peripheral prints often include characters that have since fallen out of use, which are sadly neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface: the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the researchers' point of view, there is a danger that the needs of linguists are not necessarily met. A further notable downside is the lack of a shared goal or social affinity: there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
There has also been criticism that digital humanities makes the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more imminent when you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or longing for revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to carefully identify the potential niches for completing the needed tasks. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be a different one. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools from which to draw resources, their specific richness in skill is suited to the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research needs more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in fields of vocabulary where the researchers require more information. For instance, there is a lack of Hill Mari words and terminology in anatomy.
We have digitized books on medicine, and we could try to track the words related to human organs by assigning citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving 'two masters': research and society.
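The crowdsourcing editor described above works on ALTO XML, in which each OCRed word is a String element whose recognized text is stored in a CONTENT attribute. A minimal sketch of applying a correction programmatically follows; the fragment, IDs, and words are hypothetical, and real ALTO files carry namespaces and Page/TextBlock structure that is omitted here for brevity:

```python
import xml.etree.ElementTree as ET

# Minimal ALTO-like fragment (simplified: namespaces and layout elements
# such as Page and TextBlock are omitted). The misrecognized word and the
# element IDs are invented for illustration.
alto = """
<alto>
  <TextLine>
    <String ID="S1" CONTENT="kiijat"/>
    <String ID="S2" CONTENT="luetaan"/>
  </TextLine>
</alto>
"""

# Hypothetical correction submitted by a citizen scientist: S1 -> "kirjat".
corrections = {"S1": "kirjat"}

root = ET.fromstring(alto)
for s in root.iter("String"):
    if s.get("ID") in corrections:
        s.set("CONTENT", corrections[s.get("ID")])  # write correction back

out = ET.tostring(root, encoding="unicode")
print(out)  # the S1 element now carries CONTENT="kirjat"
```

In the real workflow, the corrected ALTO files would be written back to the collection through the editor's REST back-end rather than serialized locally as here.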

Relevance:

20.00%

Publisher:

Abstract:

The determination of the sterilization value for low-acid foods in retorts includes a critical evaluation of the factory's facilities and utilities, validation of the heat processing equipment (by heat distribution assays), and finally heat penetration assays with the product. The intensity of the heat process applied to the food can be expressed by the Fo value (the sterilization value, in minutes, at a reference temperature of 121.1 °C and a thermal index, z, of 10 °C, for Clostridium botulinum spores). For safety reasons, the lowest Fo value obtained in heat penetration assays is frequently adopted as indicative of the minimum process intensity applied. This lowest Fo value should always be higher than the minimum Fo recommended for the food in question. However, the use of the Fo value for the coldest point can fail to statistically explain all the practical occurrences in food heat treatment processes. Thus, as a result of intense experimental work, we aimed to develop a new approach to determining the lowest Fo value, which we have termed the critical Fo. The critical Fo is based on a statistical model for the interpretation of the results of heat penetration assays in packages, and it depends not only on the Fo values found at the coldest point of the package and the coldest point of the equipment, but also on the size of the batch of packages processed in the retort, the total processing time in the retort, and the time between CIPs (clean-in-place cycles) of the retort. In the present study, we explored the results of physical measurements used in the validation of food heat processes. Three worked examples were prepared to illustrate the methodology developed and to introduce the concept of the critical Fo for the processing of canned food.
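The Fo value defined above is the process lethality integrated over time, Fo = Σ Δt · 10^((T − 121.1)/10) for temperature samples T (°C) taken every Δt minutes at the cold spot. A minimal sketch follows; the function name and the sampled temperature profile are illustrative:

```python
# Fo (sterilization value, min): lethality integrated over the process,
# Fo = sum(dt * 10 ** ((T - T_REF) / Z)), with T_REF = 121.1 C and
# z = 10 C for Clostridium botulinum spores, as in the abstract.
T_REF = 121.1
Z = 10.0

def fo_value(temps_c, dt_min):
    """Fo from cold-spot temperatures sampled every dt_min minutes."""
    return sum(dt_min * 10 ** ((t - T_REF) / Z) for t in temps_c)

# Example: 10 min held exactly at the reference temperature gives Fo = 10,
# while 10 min at 111.1 C (one z-value lower) contributes ten times less.
print(fo_value([121.1] * 10, 1.0))  # 10.0
print(fo_value([111.1] * 10, 1.0))  # 1.0 (approximately)
```

In practice the temperature profile comes from thermocouples placed at the coldest point of the package during heat penetration assays, and the critical Fo proposed in the study layers a statistical model on top of many such measurements.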

Relevance:

20.00%

Publisher:

Abstract:

The aims of this study were to investigate the hygienic practices in the food production of an institutional foodservice unit in Southern Brazil and to evaluate the effect of implementing good food handling practices and standard operational procedures using microbiological hygiene indicators. An initial survey of the general operating conditions classified the unit as regular in terms of compliance with state safety guidelines for food service establishments. An action plan that incorporated the correction of noncompliance issues and the training of food handlers in good food handling practices and standard operational procedures was then implemented. The results of the microbiological analysis of utensils, preparation surfaces, food handlers' hands, water, and ambient air were recorded before and after the implementation of the action plan. The results showed that the implementation of this type of practice leads to the production of safer foods.

Relevance:

20.00%

Publisher:

Abstract:

This study aimed to verify the hygienic-sanitary working practices of, and to create and implement a Hazard Analysis Critical Control Point (HACCP) plan in, two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for export. The application of the hygienic-sanitary checklist in the industries analyzed showed conformity rates over 96% for the aspects evaluated. The use of the HACCP plan resulted in the detection of two critical control points (CCPs), at the receiving and classification steps in the processing of frozen lobster and frozen lobster tails, and an additional CCP was detected at the cooking step in the processing of the whole frozen cooked lobster. The proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective method of monitoring the hazards at each CCP.

Relevance:

20.00%

Publisher:

Abstract:

This study investigates the societal effectiveness of the transport sector's research and development (R&D) operations. In this study, effectiveness refers to an organization's capability to produce the intended and desired impacts through its operations. The aim of this study is to identify the motives for evaluating societal effectiveness and to recognize the critical success factors for improving effectiveness. The theoretical framework focuses first on the policy context of effectiveness evaluation in the public sector, and secondly introduces the concept and process of effectiveness evaluation. The empirical part is carried out as a case study, which investigates the societal effectiveness of the Finnish Transport Agency's (FTA's) R&D. The aim is to recognize the FTA's critical success factors for improving the societal effectiveness of its R&D operations. Based on these factors, the organization will be able to define indicators for measuring effectiveness in its future operations. In this study, societal effectiveness is investigated from R&D purchasers' and R&D end-users' points of view, according to the Purchaser-Provider model. The results indicate that societal effectiveness evaluation is an important part of R&D operations, but that implementing the evaluation as part of daily operations is challenging. Because of limited resources, the organization is forced into strong prioritization, and therefore R&D tasks are secondary to operational tasks. Based on the results, the critical success factors can be recognized as: resources and prioritization; a clear strategy and objectives; internal communications; cooperation between the public and private sectors; and R&D implementation and dissemination.