989 results for Critical Initial Approximations
Abstract:
This article proposes a comprehensive view of the origin of the mammalian brain. We discuss (i) from which region in the brain of a reptilian-like ancestor the isocortex originated, and (ii) how the multilayered structure of the isocortex arose from a simple-layered structure like that observed in the cortex of present-day reptiles. Regarding question (i), there have been two alternative hypotheses: one suggesting that most or all of the isocortex originated from the dorsal pallium, and the other suggesting that part of the isocortex originated from a ventral pallial component. The latter implies that a massive tangential migration of cells from the ventral pallium to the dorsal pallium takes place in isocortical development, something that has not been shown. Question (ii) refers to the origin of the six-layered isocortex from a primitive three-layered cortex. It is argued that the superficial isocortical layers can be considered an evolutionary acquisition of the mammalian brain, since no equivalent structures can be found in the reptilian brain. Furthermore, a characteristic of the isocortex is that it develops according to an inside-out neurogenetic gradient, in which late-produced cells migrate past layers of early-produced cells. It is proposed that this inside-out gradient was partly achieved by the activation of a signaling pathway associated with the Cdk5 kinase and its activator p35, while an extracellular protein called reelin (secreted in the marginal zone during development) may have prevented migrating cells from penetrating the developing marginal zone (future layer I).
Abstract:
Nowadays, computer-based systems tend to become more complex and control increasingly critical functions in many areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
Abstract:
We conducted a retrospective analysis of the influence of full doses of calcineurin inhibitors [8-10 mg kg⁻¹ day⁻¹ cyclosporine (N = 80), or 0.2-0.3 mg kg⁻¹ day⁻¹ tacrolimus (N = 68)] administered from day 1 after transplantation on the transplant outcomes of a high-risk population. Induction therapy was used in 13% of the patients. Patients also received azathioprine (2 mg kg⁻¹ day⁻¹, N = 58) or mycophenolate mofetil (2 g/day, N = 90), and prednisone (0.5 mg kg⁻¹ day⁻¹, N = 148). Mean time on dialysis was 79 ± 41 months, 12% of the cases were re-transplants, and 21% had panel reactive antibodies >10%. In 43% of donors the cause of death was cerebrovascular disease and 27% showed creatinine above 1.5 mg/dL. The incidence of slow graft function (SGF) and delayed graft function (DGF) was 15 and 60%, respectively. Mean time to last dialysis and to nadir creatinine were 18 ± 15 and 34 ± 20 days, respectively. Mean creatinine at 1 year after transplantation was 1.48 ± 0.50 mg/dL (DGF 1.68 ± 0.65 vs SGF 1.67 ± 0.66 vs immediate graft function (IGF) 1.41 ± 0.40 mg/dL, P = 0.089). The incidence of biopsy-confirmed acute rejection was 22% (DGF 31%, SGF 10%, IGF 8%). One-year patient and graft survival was 92.6 and 78.4%, respectively. The incidence of cytomegalovirus disease, post-transplant diabetes mellitus and malignancies was 28, 8.1, and 0%, respectively. Compared to previous studies, the use of initial full doses of calcineurin inhibitors without antibody induction in patients with SGF or DGF had no negative impact on patient and graft survival.
Abstract:
A concurrent prospective study was conducted from 2001 to 2003 to assess factors associated with adverse reactions among individuals initiating antiretroviral therapy at two public referral HIV/AIDS centers in Belo Horizonte, MG, Brazil. Adverse reactions were obtained from medical charts reviewed up to 12 months after the first antiretroviral prescription. A Cox proportional hazards model was used to perform univariate and multivariate analyses. Relative hazards (RH) were estimated with 95% confidence intervals (CI). Among 397 charts reviewed, 377 (95.0%) had precise information on adverse reactions and initial antiretroviral treatment. Most patients received triple combination regimens including nucleoside reverse transcriptase inhibitors, non-nucleoside reverse transcriptase inhibitors and protease inhibitors. At least one adverse reaction was recorded on 34.5% (N = 130) of the medical charts (0.17 adverse reactions per 100 person-days), with nausea (14.5%) and vomiting (13.1%) being the most common. Variables independently associated with adverse reactions were: regimens with nevirapine (RH = 1.78; 95% CI = 1.07-2.96), indinavir or indinavir/ritonavir combinations (RH = 2.05; 95% CI = 1.15-3.64), female sex (RH = 1.93; 95% CI = 1.31-2.83), 5 or more outpatient visits (RH = 1.94; 95% CI = 1.25-3.01), non-adherence to antiretroviral therapy (RH = 2.38; 95% CI = 1.62-3.51), and a CD4+ count of 200 to 500 cells/mm³ (RH = 2.66; 95% CI = 1.19-5.90). An independent and negative association was also found for alcohol use (RH = 0.55; 95% CI = 0.33-0.90). Adverse reactions were substantial among participants initiating antiretroviral therapy. Specially designed protocols in HIV/AIDS referral centers may improve the diagnosis, management and prevention of adverse reactions, thus contributing to improved adherence to antiretroviral therapy among HIV-infected patients.
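As a hedged illustration of the modelling step only (not the authors' actual analysis), the sketch below shows how relative hazards with 95% confidence intervals can be obtained from chart-review data with a Cox proportional hazards model. It assumes the Python lifelines package, and the data frame and column names are hypothetical.

    # A minimal sketch, assuming the lifelines package; the data and column
    # names are hypothetical and only illustrate the modelling step.
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical chart-review data: follow-up time in days, an indicator of
    # whether an adverse reaction occurred, and two 0/1-coded covariates.
    df = pd.DataFrame({
        "time_days":     [90, 365, 30, 365, 180, 365, 60, 365, 240, 120],
        "adverse_event": [ 1,   0,  1,   0,   1,   0,  1,   0,   0,   1],
        "nevirapine":    [ 1,   0,  1,   0,   0,   1,  1,   0,   1,   0],
        "female":        [ 1,   1,  0,   0,   1,   0,  1,   0,   1,   0],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_days", event_col="adverse_event")
    cph.print_summary()  # exp(coef) gives the relative hazards, with 95% CIs

In a full analysis, the remaining covariates reported in the abstract (regimen, outpatient visits, adherence, CD4+ count, alcohol use) would be added to the model in the same way.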
Abstract:
Since there are some concerns about the effectiveness of highly active antiretroviral therapy in developing countries, we compared initial combination antiretroviral therapy with zidovudine and lamivudine plus either nelfinavir or efavirenz at a university-based outpatient service in Brazil. This was a retrospective comparative cohort study carried out in a tertiary-level hospital. A total of 194 patients receiving either nelfinavir or efavirenz were identified through our electronic database search, but only 126 patients met the inclusion criteria. Patients were included if they were older than 18 years, naïve to antiretroviral therapy, and had at least one follow-up visit after starting the antiretroviral regimen. Fifty-one of the included patients were receiving a nelfinavir-based regimen and 75 an efavirenz-based regimen as outpatients. Antiretroviral therapy was prescribed to all patients according to current guidelines. By intention-to-treat analysis (missing/switch = failure), after a 12-month period, 65% of the patients in the efavirenz group reached a viral load <400 copies/mL compared to 41% of the patients in the nelfinavir group (P = 0.01). The mean CD4 cell count increase after a 12-month period was also greater in the efavirenz group (195 × 10⁶ cells/L) than in the nelfinavir group (119 × 10⁶ cells/L; P = 0.002). The efavirenz-based regimen was superior to the nelfinavir-based regimen. The low response rate in the nelfinavir group might be partially explained by the difficulty of using a regimen requiring higher patient compliance (12 vs 3 pills a day) in a developing country.
Abstract:
The autonomic nervous system plays an important role in physiological and pathological conditions, and has been extensively evaluated by parametric and non-parametric spectral analysis. To compare the results obtained with the fast Fourier transform (FFT) and the autoregressive (AR) method, we performed a comprehensive comparative study using data from humans and rats during pharmacological blockade (in rats), a postural test (in humans), and in the hypertensive state (in both humans and rats). Although postural hypotension in humans induced an increase in the normalized low-frequency power (LFnu) of systolic blood pressure, the increase in the LF/HF ratio was detected only by AR. In rats, AR and FFT analysis did not agree for LFnu and normalized high-frequency power (HFnu) under basal conditions and after vagal blockade. The increase in the LF/HF ratio of the pulse interval induced by methylatropine was detected only by FFT. In hypertensive patients, changes in LF and HF for systolic blood pressure were observed only by AR; FFT was able to detect the reduction in both blood pressure variance and total power. In hypertensive rats, AR and FFT yielded different values of variance and total power for systolic blood pressure. Moreover, AR and FFT presented discordant results for LF, LFnu, HF, the LF/HF ratio, and total power for the pulse interval. We provide evidence for disagreement in 23% of the indices of blood pressure and heart rate variability in humans and 67% discordance in rats when these variables are evaluated by AR and FFT under physiological and pathological conditions. The overall disagreement between AR and FFT in this study was 43%.
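To make the FFT/AR comparison concrete, the sketch below contrasts an FFT-based (Welch) spectral estimate with an autoregressive (Yule-Walker) estimate for a resampled pulse-interval series and derives LF, HF, normalized and LF/HF indices from each. It is a minimal illustration rather than the authors' pipeline, and it assumes the conventional human frequency bands (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz), a 4 Hz resampling rate, an arbitrary AR model order of 16, and a synthetic test signal.

    # A minimal sketch (not the authors' pipeline) contrasting FFT-based and
    # autoregressive spectral estimates of a resampled pulse-interval series.
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid
    from statsmodels.regression.linear_model import yule_walker

    FS = 4.0                                       # tachogram resampling rate, Hz
    LF_BAND, HF_BAND = (0.04, 0.15), (0.15, 0.40)  # conventional human bands, Hz

    def band_power(freqs, psd, band):
        """Integrate the PSD over one frequency band."""
        mask = (freqs >= band[0]) & (freqs < band[1])
        return trapezoid(psd[mask], freqs[mask])

    def fft_spectrum(x, fs=FS):
        """FFT-based PSD estimate (Welch's method)."""
        return welch(x - np.mean(x), fs=fs, nperseg=256)

    def ar_spectrum(x, fs=FS, order=16, nfreq=512):
        """AR PSD estimate built from Yule-Walker coefficients."""
        rho, sigma = yule_walker(x - np.mean(x), order=order, method="mle")
        freqs = np.linspace(0.0, fs / 2.0, nfreq)
        # PSD(f) = sigma^2 / (fs * |1 - sum_k rho_k * exp(-i 2 pi f k / fs)|^2)
        z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, order + 1)) / fs)
        return freqs, sigma**2 / (fs * np.abs(1.0 - z @ rho) ** 2)

    # Synthetic pulse-interval series with LF (0.10 Hz) and HF (0.25 Hz) rhythms
    t = np.arange(0, 300, 1 / FS)
    x = 0.05 * np.sin(2 * np.pi * 0.10 * t) + 0.03 * np.sin(2 * np.pi * 0.25 * t)
    x += 0.01 * np.random.default_rng(0).standard_normal(t.size)

    for name, (f, p) in {"FFT": fft_spectrum(x), "AR": ar_spectrum(x)}.items():
        lf, hf = band_power(f, p, LF_BAND), band_power(f, p, HF_BAND)
        print(f"{name}: LF/HF = {lf / hf:.2f}, LFnu = {100 * lf / (lf + hf):.1f}, "
              f"HFnu = {100 * hf / (lf + hf):.1f}")

Because the two estimators smooth the spectrum differently (windowed averaging versus a parametric model of fixed order), their band powers and derived indices need not agree, which is the kind of discrepancy quantified in the study.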
Abstract:
Negotiating trade agreements is an important part of government trade policies, economic planning and today's globally operating trading system. In global comparison, the European Union and the United States have been active in the formation of trade agreements. Now these two economic giants are engaged in negotiations to form their own trade agreement, the so-called Transatlantic Trade and Investment Partnership (TTIP). The purpose of this thesis is to understand the reasons for making a trade agreement between two economic areas and the issues it may include in the case of the TTIP. The TTIP has received a great deal of attention in the media: opinions towards the partnership have been extreme, and the debate has been heated. The purpose of this study is to describe the nature of the public discussion regarding the TTIP from spring 2013 until 2014. The research problem is to find out what the main issues in the agreement are and what values influence them. The study was conducted by applying methods of critical discourse analysis to the chosen data. This includes gathering the issues from the data based on the attention each has received in the discussion. The underlying motives for raising different issues were analysed by investigating the authors' positions in political, economic and social circuits. The perceived economic impacts of the TTIP were also analysed with the same criteria. The research material included some of the most respected economic newspapers globally, as well as papers and reports published by the EU and global organisations. The analysis indicates a clear dichotomy in attitudes towards the TTIP. Key problems include the lack of transparency in the negotiations, the misunderstood investor-state dispute settlement, the constantly expanding regulatory issues and the risk of protectionism. The theory and data suggest that the removal of tariffs is an effective tool for reaching economic gains in the TTIP, and that reducing non-tariff barriers, such as protectionism, would be even more effective. Critics are worried about the rising influence of corporations over governments. The discourse analysis reveals that supporters of the TTIP hold values related to increasing welfare through economic growth. Critics do not deny the economic benefits but raise the question of inequality as a consequence. Overall, they represent softer values, such as sustainable development and democracy, as a counterweight to the corporate values of efficiency and profit maximisation.
Abstract:
Sleep is important for the recovery of a critically ill patient, as lack of sleep is known to negatively influence a person's cardiovascular system, mood, orientation, and metabolic and immune function, and may thus prolong patients' intensive care unit (ICU) and hospital stays. Intubated and mechanically ventilated patients suffer from fragmented and light sleep. However, little is known about how non-intubated patients sleep. The evaluation of patients' sleep may be compromised by their fatigue and still posture, which give no indication of whether they are asleep or not. The purpose of this study was to evaluate ICU patients' sleep evaluation methods, the quality of non-intubated patients' sleep, and the sleep evaluations performed by ICU nurses. The aims were to develop recommendations for ICU nurses on the evaluation of patients' sleep and to provide a description of the quality of non-intubated patients' sleep. The literature review of ICU patients' sleep evaluation methods extended to the end of 2014. The evaluation of the quality of patients' sleep was conducted with four data sets: A) the nurses' narrative documentation of the quality of patients' sleep (n=114), B) the nurses' sleep evaluations (n=21) with a structured observation instrument, C) the patients' self-evaluations (n=114) with the Richards-Campbell Sleep Questionnaire, and D) polysomnographic evaluations of the quality of patients' sleep (n=21). The correspondence of data set A with data set C (collected 4–8/2011), and of data set B with data set D (collected 5–8/2009), was analysed. Content analysis was used for the nurses' documentation and statistical analyses for all the other data. The quality of non-intubated patients' sleep varied between individuals. In many patients, sleep was light, awakenings were frequent, and the amount of sleep was insufficient compared to sleep in healthy people. However, some patients were able to sleep well. On average, the patients evaluated the quality of their sleep as neither high nor low. On a scale from 0 (poor sleep) to 100 (good sleep), sleep depth was evaluated as the worst and the speed of falling asleep as the best aspect of sleep. Nursing care was mostly performed while the patients were awake, and thus its disturbing effect was low. The instruments available for nurses to evaluate the quality of patients' sleep were limited and measured mainly the quantity of sleep. Nurses' structured observational evaluations of the quality of patients' sleep were correct in approximately two thirds of the cases, and only regarding total sleep time. Nurses' narrative documentation of patients' sleep corresponded with the patients' self-evaluations in just over half of the cases. However, nurses documented several dimensions of sleep that are not included in the present sleep evaluation instruments; these could be classified according to the components of the nursing process: needs assessment, sleep assessment, intervention, and effect of intervention. Valid, more comprehensive sleep evaluation methods are needed for nurses to evaluate, document, improve and study the quality of patients' sleep.
Abstract:
The determination of the sterilization value for low-acid foods in retorts includes a critical evaluation of the factory's facilities and utilities, validation of the heat processing equipment (by heat distribution assays), and finally heat penetration assays with the product. The intensity of the heat process applied to the food can be expressed by the Fo value (the sterilization value, in minutes, at a reference temperature of 121.1 °C and a thermal index, z, of 10 °C, for Clostridium botulinum spores). For safety reasons, the lowest Fo value obtained in the heat penetration assays is frequently adopted as indicative of the minimum process intensity applied. This lowest Fo value should always be higher than the minimum Fo recommended for the food in question. However, the use of the Fo value at the coldest point can fail to statistically account for all the practical occurrences in food heat treatment processes. Thus, as a result of intensive experimental work, we aimed to develop a new approach to determining the lowest Fo value, which we renamed the critical Fo. The critical Fo is based on a statistical model for the interpretation of the results of heat penetration assays in packages, and it depends not only on the Fo values found at the coldest point of the package and the coldest point of the equipment, but also on the size of the batch of packages processed in the retort, the total processing time in the retort, and the time between CIP (clean-in-place) cycles of the retort. In the present study, we sought to explore the results of the physical measurements used in the validation of food heat processes. Three worked examples were prepared to illustrate the methodology developed and to introduce the concept of the critical Fo for the processing of canned food.
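For concreteness, the sketch below shows the standard lethality integral underlying the Fo value, Fo = ∫ 10^((T(t) - 121.1)/z) dt with z = 10 °C, evaluated numerically over a time-temperature history recorded at the coldest point of the package. The profile is purely illustrative, and the sketch covers only the basic Fo calculation, not the statistical model behind the critical Fo proposed in the article.

    # A minimal sketch of the lethality integral behind the Fo value
    # (general method): Fo = integral of 10**((T(t) - Tref)/z) dt, with
    # Tref = 121.1 °C and z = 10 °C. The time-temperature profile below is
    # illustrative only, not measured data.
    import numpy as np

    T_REF = 121.1  # reference temperature, °C
    Z = 10.0       # thermal index (z value), °C

    def fo_value(time_min, temp_c, t_ref=T_REF, z=Z):
        """Trapezoidal integration of the lethality rate over time (minutes)."""
        time_min = np.asarray(time_min, dtype=float)
        lethality = 10.0 ** ((np.asarray(temp_c, dtype=float) - t_ref) / z)
        dt = np.diff(time_min)
        return np.sum(dt * (lethality[1:] + lethality[:-1]) / 2.0)

    # Hypothetical cold-point profile: come-up, holding phase, cooling
    time_min = [0, 10, 20, 30, 40, 50, 60, 65, 70]
    temp_c   = [30, 80, 105, 115, 119, 120, 120, 90, 50]

    print(f"Fo = {fo_value(time_min, temp_c):.1f} min")

In practice, this calculation is repeated for every monitored package in the heat penetration assays, and it is the distribution of the resulting Fo values, together with batch size and processing times, that the proposed critical Fo model interprets statistically.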
Abstract:
This study aimed to verify hygienic-sanitary working practices and to create and implement a Hazard Analysis Critical Control Point (HACCP) plan in two lobster processing plants in Pernambuco State, Brazil. The plants studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for export. The application of the hygienic-sanitary checklist at the plants analyzed showed conformity rates of over 96% with the aspects evaluated. The application of the HACCP plan resulted in the detection of two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional CCP was detected at the cooking step in the processing of the frozen whole cooked lobster. The proper implementation of the HACCP plan at the lobster processing plants studied proved to be the safest and most cost-effective method of monitoring the hazards at each CCP.