612 results for broca-do-cacho


Relevance: 10.00%

Abstract:

During the drilling of oil and natural gas wells, solid, liquid, and gaseous wastes are generated. The solid fragments, known as cuttings, are carried to the surface by the drilling fluid. This fluid also serves to cool the drill bit and maintain the internal pressure of the well, among other functions. The solid residue is highly polluting because, in addition to the incorporated drilling fluid (which contains several chemical additives harmful to the environment), it carries heavy metals such as lead. To minimize the residue generated, numerous techniques are currently being studied to mitigate the problems such waste can cause to the environment, such as adding cuttings to the composition of soil-cement bricks for masonry construction, adding cuttings to the clay matrix for the manufacture of solid masonry bricks and ceramic blocks, and co-processing cuttings in cement. The main objective of this work is therefore the incorporation of oil-well drill cuttings into the cement slurry used in well cementing operations. The cuttings used in this study, originating from the Pendências formation, were milled and passed through a 100 mesh sieve. After grinding, they had a mean particle size on the order of 86 μm and a crystal structure containing quartz- and calcite-type phases, characteristic of Portland cement. Cement slurries with a density of 13 lb/gal, containing different concentrations of cuttings, were formulated and prepared, and characterization tests were performed according to API SPEC 10A and RP 10B. Free-water tests showed values below 5.9%, and the rheological model that best described the behavior of the mixtures was the power-law model. The results for compressive strength (10.3 MPa) and stability (Δρ < 0.5 lb/gal) fell within the limits set by operational procedures. Thus, cuttings from the drilling operation may be used, in addition to Portland cement, as binders in oil wells, reusing this waste and reducing the cost of the cement slurry.
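
The abstract reports that the power-law (Ostwald-de Waele) model best described the slurries' rheology. As a minimal illustration of how such a fit is obtained from viscometer readings, the Python sketch below fits tau = K * gamma_dot^n to invented six-speed readings; the numbers are placeholders, not the study's data.

```python
# Hypothetical sketch: fitting the power-law (Ostwald-de Waele) model,
# tau = K * gamma_dot**n, to viscometer readings like those taken in
# API RP 10B characterization. All data points are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def power_law(gamma_dot, K, n):
    """Shear stress (Pa) as a function of shear rate (1/s)."""
    return K * gamma_dot ** n

# Example six-speed viscometer readings (assumed values)
shear_rate = np.array([5.1, 10.2, 170.0, 340.0, 511.0, 1022.0])   # 1/s
shear_stress = np.array([4.2, 6.1, 28.5, 41.0, 52.3, 77.9])       # Pa

(K, n), _ = curve_fit(power_law, shear_rate, shear_stress, p0=(1.0, 0.5))
print(f"consistency index K = {K:.3f} Pa.s^n, flow behavior index n = {n:.3f}")
```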

Relevance: 10.00%

Abstract:

Drilling fluids are of fundamental importance in petroleum activities, since they are responsible for removing the cuttings, maintaining pressure and well stability (preventing collapse and the inflow of formation fluid), and lubricating and cooling the drill bit. There are basically three types of drilling fluids: water-based, non-aqueous, and aerated. Water-based drilling fluid is widely used because it is less aggressive to the environment and provides excellent stability and inhibition (when formulated as an inhibitive fluid), among other qualities. Produced water is generated together with oil during production and has high concentrations of metals and contaminants, so it must be treated before disposal. The produced water from the Urucu-AM and Riacho da Forquilha-RN fields has high concentrations of contaminants, metals, and salts such as calcium and magnesium, complicating its treatment and disposal. Thus, the objective of this work was to analyze the use of synthetic produced water with characteristics similar to the produced water from Urucu-AM and Riacho da Forquilha-RN to formulate a water-based drilling fluid, observing the influence of varying calcium and magnesium concentrations on filtrate and rheology tests. A simple 3² factorial experimental design was used for statistical modeling of the data. The results showed that the varying concentrations of calcium and magnesium did not influence the rheology of the fluid: plastic viscosity, apparent viscosity, and initial and final gels did not vary significantly. In the filtrate tests, calcium concentration had a linear influence on chloride concentration: the higher the calcium concentration, the higher the chloride concentration in the filtrate. For the Urucu produced-water-based fluids, calcium concentration had a quadratic influence on filtrate volume, meaning that high calcium concentrations interfere with the action of the inhibitors used in the fluid formulation. For the Riacho da Forquilha produced-water-based fluid, the influence of calcium on filtrate volume was linear. Magnesium concentration was significant only for chloride concentration, in a quadratic way, and only for the Urucu produced-water-based fluids. The fluid with the maximum magnesium concentration (9.411 g/L) but the minimum calcium concentration (0.733 g/L) showed good results. Therefore, a produced water with a maximum magnesium concentration of 9.411 g/L and a maximum calcium concentration of 0.733 g/L can be used to formulate water-based drilling fluids with appropriate properties for this kind of fluid.
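
As an illustration of the 3² factorial modeling described above, the sketch below builds the nine coded runs for two factors (calcium and magnesium) and fits a full quadratic regression, the kind of model from which the reported linear and quadratic effects would be read. All response values are invented placeholders, not the study's measurements.

```python
# Hypothetical sketch: a 3^2 factorial analysis with linear, quadratic and
# interaction terms for calcium and magnesium against a response such as
# filtrate chloride concentration. Data are synthetic.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Three coded levels (-1, 0, +1) per factor -> 9 runs
levels = [-1, 0, 1]
runs = pd.DataFrame(list(itertools.product(levels, levels)), columns=["ca", "mg"])

# Assumed response: linear Ca effect plus noise (stand-in for lab data)
rng = np.random.default_rng(42)
runs["chloride"] = 50 + 12 * runs["ca"] + rng.normal(0, 1.5, len(runs))

# Full quadratic model; p-values indicate which effects are significant
model = smf.ols("chloride ~ ca + mg + I(ca**2) + I(mg**2) + ca:mg", data=runs).fit()
print(model.summary())
```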

Relevance: 10.00%

Abstract:

Wireless Sensor and Actuator Networks (WSAN) are a key component of Ubiquitous Computing Systems and have many applications in different knowledge domains. Programming such networks is very hard and requires developers to know the specificities of the available sensor platforms, steepening the learning curve for developing WSAN applications. In this work, an MDA (Model-Driven Architecture) approach for WSAN application development called ArchWiSeN is proposed. The goal of this approach is to facilitate the development task by providing: (i) a WSAN domain-specific language; (ii) a methodology for WSAN application development; and (iii) an MDA infrastructure composed of several software artifacts (PIM, PSMs, and transformations). ArchWiSeN allows domain experts to contribute directly to WSAN application development without specialized knowledge of WSAN platforms and, at the same time, allows network experts to manage the application requirements without specific knowledge of the application domain. Furthermore, the approach also aims to enable developers to express and validate functional and non-functional requirements of the application, incorporate services offered by WSAN middleware platforms, and promote reuse of the developed software artifacts. In this sense, this thesis proposes an approach that includes all WSAN development stages for current and emerging scenarios through the proposed MDA infrastructure. The proposal was evaluated by: (i) a proof of concept encompassing three different scenarios, in which the MDA infrastructure was used to describe the WSAN development process following the application engineering process; (ii) a controlled experiment assessing the proposed approach against the traditional method of WSAN application development; (iii) an analysis of ArchWiSeN's support for middleware services, ensuring that WSAN applications using such services can achieve their requirements; and (iv) a systematic analysis of ArchWiSeN in terms of the characteristics desired of an MDA tool, compared with other existing MDA tools for WSAN.
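
To make the MDA workflow concrete, here is a deliberately toy Python sketch of the PIM-to-PSM idea: one platform-independent sensing task transformed by two platform-specific rules. The task structure, platform names, and generated snippets are invented for illustration and are not ArchWiSeN's actual artifacts or DSL.

```python
# Hypothetical sketch of the MDA idea: a platform-independent model (PIM)
# of a sensing task is transformed into platform-specific models (PSMs).
from dataclasses import dataclass

@dataclass
class SensingTask:          # PIM element: what to sense, how often
    phenomenon: str
    period_s: int

def to_tinyos_psm(task: SensingTask) -> str:
    """Transformation rule targeting a TinyOS-like platform (assumed)."""
    return (f"components SensorC, TimerC;\n"
            f"// sample {task.phenomenon} every {task.period_s * 1000} ms")

def to_contiki_psm(task: SensingTask) -> str:
    """Transformation rule targeting a Contiki-like platform (assumed)."""
    return (f"etimer_set(&et, CLOCK_SECOND * {task.period_s}); "
            f"/* read {task.phenomenon} */")

pim = SensingTask(phenomenon="temperature", period_s=30)
print(to_tinyos_psm(pim))
print(to_contiki_psm(pim))
```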

Relevance: 10.00%

Abstract:

High dependability, availability, and fault tolerance are open problems in Service-Oriented Architecture (SOA). The possibility of building software applications by reliably integrating services from heterogeneous domains makes it worthwhile to face the challenges inherent to this paradigm. To ensure quality in service compositions, some research efforts propose the adoption of verification techniques to identify and correct errors. In this context, exception handling is a powerful mechanism for increasing SOA quality. Several research works address mechanisms for exception propagation in web services, implemented in many languages and frameworks. However, to the best of our knowledge, no existing work evaluates these mechanisms in SOA with regard to the .NET framework. The main contribution of this paper is to evaluate and propose exception propagation mechanisms in SOA for applications developed with the .NET framework. In this direction, this work: (i) extends a previous study, showing the need for a solution to exception propagation in SOA for applications developed in .NET; and (ii) presents a solution, based on a model derived from the results found in (i), to be applied in real cases through fault injection and AOP techniques.
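
As a language-neutral illustration of the fault-injection side of such an evaluation (the original targets .NET), the Python sketch below wraps a service call so that a transport-level fault is raised at a configurable rate, letting the caller's propagation and handling behavior be observed. All names and the fault type are hypothetical.

```python
# Hypothetical sketch: inject faults into a service call to observe how
# exceptions propagate across the service boundary to the consumer.
import random

class ServiceFault(Exception):
    """Stand-in for a SOAP/transport fault surfaced to the consumer."""

def inject_faults(rate: float):
    def decorator(service_call):
        def wrapper(*args, **kwargs):
            if random.random() < rate:
                # Simulated remote failure crossing the service boundary
                raise ServiceFault(f"injected fault in {service_call.__name__}")
            return service_call(*args, **kwargs)
        return wrapper
    return decorator

@inject_faults(rate=0.3)
def get_quote(symbol: str) -> float:
    return 42.0  # placeholder for the real remote invocation

try:
    print(get_quote("PETR4"))
except ServiceFault as exc:
    print(f"propagated to consumer: {exc}")  # composition-level handling
```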

Relevance: 10.00%

Abstract:

Cloud computing can be defined as a distributed computing model through which resources (hardware, storage, development platforms, and communication) are shared as paid services, accessible with minimal management effort and interaction. A great benefit of this model is that it enables the use of multiple providers (e.g., a multi-cloud architecture) to compose a set of services and obtain an optimal configuration for performance and cost. However, multi-cloud use is hindered by the problem of cloud lock-in: the dependency between an application and a cloud platform. It is commonly addressed by three strategies: (i) use of an intermediation layer standing between the consumers of cloud services and the provider; (ii) use of standardized interfaces to access the cloud; or (iii) use of models with open specifications. This work outlines an approach for evaluating these strategies. The evaluation was carried out and showed that, despite the advances made by these strategies, none of them actually solves the cloud lock-in problem. In this sense, this work proposes the use of the Semantic Web to avoid cloud lock-in: RDF models are used to specify the features of a cloud and are managed through SPARQL queries. In this direction, this work: (i) presents an evaluation model that quantifies the cloud lock-in problem; (ii) evaluates cloud lock-in across three multi-cloud solutions and three cloud platforms; (iii) proposes the use of RDF and SPARQL for managing cloud resources; (iv) presents the Cloud Query Manager (CQM), a SPARQL server that implements the proposal; and (v) compares three multi-cloud solutions against CQM in terms of response time and effectiveness in resolving cloud lock-in.
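
A minimal sketch of the RDF/SPARQL idea underlying the proposal, using the rdflib package: cloud features are stated as triples and a SPARQL query selects providers. The vocabulary (namespace, properties) and the providers are invented for illustration and do not reproduce CQM's actual schema.

```python
# Hypothetical sketch: describe cloud features as RDF triples and select
# providers with a SPARQL query. Requires the rdflib package.
from rdflib import Graph, Literal, Namespace, RDF

CLOUD = Namespace("http://example.org/cloud#")
g = Graph()

for name, price in [("providerA", 0.08), ("providerB", 0.12)]:
    node = CLOUD[name]
    g.add((node, RDF.type, CLOUD.Provider))
    g.add((node, CLOUD.offers, CLOUD.ObjectStorage))
    g.add((node, CLOUD.pricePerGB, Literal(price)))

query = """
PREFIX cloud: <http://example.org/cloud#>
SELECT ?p ?price WHERE {
    ?p a cloud:Provider ;
       cloud:offers cloud:ObjectStorage ;
       cloud:pricePerGB ?price .
} ORDER BY ?price
"""
for provider, price in g.query(query):
    print(provider, price)  # cheapest storage provider first
```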

Relevance: 10.00%

Abstract:

Nicotine administration in humans and rodents enhances memory and attention, and also has a positive effect in Alzheimer's disease. The Medial Septum / Diagonal Band of Broca complex (MS/DBB), a major cholinergic system, projects massively to the hippocampus through the fimbria-fornix, forming the so-called septohippocampal pathway. It has been demonstrated that the MS/DBB acts directly on the rhythmic organization of the hippocampal local field potential (LFP), especially in the rhythmogenesis of theta (4-8 Hz), an oscillation intrinsically linked to hippocampal mnemonic function. In vitro experiments have shown that nicotine applied to the MS/DBB generates a local network theta rhythm within the MS/DBB. Thus, the present study set out to elucidate the effect of nicotine in the MS/DBB on the septohippocampal pathway. In vivo experiments compared the effect of MS/DBB microinfusion of saline (n=5) and nicotine (n=8) in ketamine/xylazine-anaesthetized mice. We observed an increase in power spectral density in the gamma range (35 to 55 Hz) in both structures (Wilcoxon rank-sum test, p=0.038), but no change in coherence between these structures in the same range (Wilcoxon rank-sum test, p=0.60). There was also a decrease in the power of the ketamine-induced delta oscillation (1 to 3 Hz). We also performed in vitro experiments on the effect of nicotine on membrane voltage and action potentials. We patch-clamped 22 neurons in current-clamp mode; 12 neurons were responsive to nicotine, half of them increasing their firing rate and the other 6 decreasing it, and the two subsets differed significantly in action potential threshold (-47.3±0.9 mV vs. -41±1.9 mV, respectively, p=0.007) and half-width time (1.6±0.08 ms vs. 2±0.12 ms, respectively, p=0.01). Furthermore, we performed another set of in vitro experiments on the connectivity of the three major neuronal populations of the MS/DBB, which use acetylcholine, GABA, or glutamate as neurotransmitters. Paired patch-clamp recordings showed that glutamatergic and GABAergic neurons make intra-septal connections that produce sizable currents in MS/DBB postsynaptic neurons. The connection probabilities between the different neuronal populations gave rise to an MS/DBB topology that was implemented in a realistic model, which corroborates that the network is highly sensitive to the generation of gamma rhythm. Together, the data from the full set of experiments suggest that nicotine may act as a cognitive enhancer by inducing gamma oscillation in the local circuitry of the MS/DBB.
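
For the spectral comparison reported (gamma power in 35-55 Hz, Wilcoxon rank-sum), here is a minimal Python sketch: Welch power spectra, integrated band power, and a rank-sum test between the two groups. The LFP traces and sampling rate below are synthetic stand-ins for the recordings.

```python
# Hypothetical sketch: gamma-band (35-55 Hz) LFP power per animal, compared
# between saline (n=5) and nicotine (n=8) groups with a rank-sum test.
import numpy as np
from scipy.signal import welch
from scipy.stats import ranksums

fs = 1000  # sampling rate in Hz (assumed)

def gamma_power(lfp, fs, band=(35.0, 55.0)):
    f, pxx = welch(lfp, fs=fs, nperseg=2 * fs)
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].sum() * (f[1] - f[0])  # integrated band power

rng = np.random.default_rng(0)
saline = [gamma_power(rng.normal(size=30 * fs), fs) for _ in range(5)]
nicotine = [gamma_power(rng.normal(size=30 * fs), fs) for _ in range(8)]

stat, p = ranksums(nicotine, saline)
print(f"rank-sum statistic = {stat:.2f}, p = {p:.3f}")
```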

Relevance: 10.00%

Abstract:

Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of such platforms. The use of multiple cloud platforms avoids the following problems: (i) vendor lock-in, the dependency of an application on a particular cloud platform, which is harmful in case of degradation or failure of platform services, or even price increases for service usage; and (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or even due to the failure of some service. In a multi-cloud scenario, it is possible to replace a failing service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, it is necessary to create mechanisms able to select which cloud services/platforms should be used, in accordance with the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying services and cloud computing platforms should be used, based on the user-defined requirements for functionality and quality; (ii) continually monitoring dynamic information (such as response time, availability, and price) related to cloud services, in addition to the wide variety of services; and (iii) adapting the application when QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration meets them more efficiently. The proposed strategy is composed of two phases. The first phase consists of application modeling, exploiting the capacity to represent commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. In this phase, an extended feature model is used to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified as properties in this model, describing dynamic information about those services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation. In this work, we implemented the adaptation strategy using several programming techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed phases, we sought to assess the following: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared to a sequential approach; and (iii) which techniques present the best trade-off between development/modularity effort and performance.
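
A minimal sketch of a MAPE-K style selection step like the one the second phase describes: monitor per-provider QoS, analyze it against a user-defined requirement, plan the cheapest compliant configuration, and execute the rebinding. The providers, metrics, and thresholds are invented for illustration, not the thesis's actual model.

```python
# Hypothetical sketch: one iteration of a MAPE-K loop that rebinds a
# service to the cheapest provider satisfying the user's QoS requirement.
services = {
    "storage": [
        {"provider": "cloudA", "response_ms": 120, "price": 0.08, "up": True},
        {"provider": "cloudB", "response_ms": 45,  "price": 0.12, "up": True},
    ],
}
requirements = {"storage": {"max_response_ms": 100}}  # user-defined NFR

def mape_k_step(bindings):
    for service, candidates in services.items():
        current = bindings.get(service)
        req = requirements[service]
        # Monitor + Analyze: keep only available, compliant candidates
        ok = [c for c in candidates
              if c["up"] and c["response_ms"] <= req["max_response_ms"]]
        # Plan: cheapest compliant provider
        best = min(ok, key=lambda c: c["price"]) if ok else None
        if best and best["provider"] != current:
            bindings[service] = best["provider"]   # Execute: rebind
            print(f"rebound {service} -> {best['provider']}")
    return bindings

print(mape_k_step({"storage": "cloudA"}))
```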

Relevance: 10.00%

Abstract:

The spread of wireless networks and the growing proliferation of mobile devices require the development of mobility control mechanisms to support the different traffic demands under different network conditions. A major obstacle to developing this kind of technology is the complexity involved in handling all the information about the large number of Moving Objects (MO), as well as the entire signaling overhead required to manage these procedures in the network. Although several initiatives have been proposed by the scientific community to address this issue, they have not proved effective, since they depend on a specific request from the MO, which is responsible for triggering the mobility process. Moreover, they are often guided only by wireless-medium statistics, such as the Received Signal Strength Indicator (RSSI) of the candidate Point of Attachment (PoA). Thus, this work seeks to develop, evaluate, and validate a sophisticated communication infrastructure for Wireless Networking for Moving Objects (WiNeMO) systems, making use of the flexibility provided by the Software-Defined Networking (SDN) paradigm, where network functions are easily and efficiently deployed by integrating the OpenFlow and IEEE 802.21 standards. For benchmarking purposes, the analysis covered both control and data plane aspects, demonstrating that the proposal significantly outperforms typical IP-based SDN and QoS-enabled capabilities by allowing the network to handle multimedia traffic with optimal Quality of Service (QoS) transport and acceptable Quality of Experience (QoE) over time.
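
As an illustration of network-initiated handover decisions that go beyond raw RSSI, the sketch below scores candidate points of attachment by signal strength and load. The scoring function and all values are assumptions for illustration, not the proposal's actual algorithm.

```python
# Hypothetical sketch: controller-side PoA selection weighing RSSI against
# current load, instead of relying on RSSI alone.
from dataclasses import dataclass

@dataclass
class PoA:
    name: str
    rssi_dbm: float   # received signal strength at the moving object
    load: float       # fraction of capacity in use, 0.0-1.0

def choose_poa(candidates, w_rssi=0.6, w_load=0.4):
    """Score = weighted normalized RSSI minus weighted load."""
    def score(p):
        rssi_norm = (p.rssi_dbm + 100) / 70  # map about -100..-30 dBm to 0..1
        return w_rssi * rssi_norm - w_load * p.load
    return max(candidates, key=score)

candidates = [PoA("ap1", -55, 0.9), PoA("ap2", -65, 0.2)]
print(choose_poa(candidates).name)  # ap2 wins despite the weaker signal
```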

Relevance: 10.00%

Abstract:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and growing demand for features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for preventing quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were conducted to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases were analyzed (seven of each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the commit properties most likely to cause performance degradation. In total, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release became available and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The Receiver Operating Characteristic (ROC) area of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
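
A minimal sketch of the variation-analysis phase under stated assumptions: execution-time samples of one scenario from two releases are compared with a Mann-Whitney U test, and a significant shift flags the release pair for repository mining. The timing samples are synthetic placeholders.

```python
# Hypothetical sketch: flag a scenario whose execution time regressed
# significantly between two releases; the next phase would mine the
# commits/issues landed between them.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
release_a = rng.normal(loc=210, scale=8, size=30)  # ms per run, release N
release_b = rng.normal(loc=230, scale=8, size=30)  # ms per run, release N+1

stat, p = mannwhitneyu(release_a, release_b, alternative="less")
delta = np.median(release_b) - np.median(release_a)
if p < 0.05:
    print(f"significant variation: median +{delta:.1f} ms (p = {p:.4f})")
```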


Relevance: 10.00%

Abstract:

Introduction: Gait after stroke is characterized by a significant asymmetry between the lower limbs, with predominant use of the non-paretic lower limb (NPLL) over the paretic lower limb. Accordingly, it has been suggested that adding load/weight to the NPLL, as a way of restricting the movement of this limb, may favor the use of the paretic limb, reducing interlimb asymmetry. However, few studies have been conducted so far, and they investigated only the immediate effects of this practice. Objectives: 1) To investigate whether adding load to the NPLL during treadmill training influences cardiovascular parameters and gait performance of individuals with stroke, compared to treadmill training without added load; 2) to analyze the effects of treadmill training with and without load added to the NPLL on kinematic parameters of each lower limb during gait; 3) to analyze the effects of treadmill training with and without load added to the NPLL on measurements of functional mobility and postural balance in these patients. Materials and Methods: This is a randomized, single-blinded clinical trial involving 38 subjects, with a mean age of 56.5 years, in the subacute post-stroke phase (mean time since stroke of 4.5 months). Participants were randomly assigned to an experimental group (EG) or a control group (CG). The EG (n=19) underwent gait training on a treadmill with load added to the NPLL by ankle weights equivalent to 5% of body weight. The CG (n=19) underwent treadmill gait training only. Behavioral strategies, including home exercises, were also applied to both groups. The interventions occurred daily for two consecutive weeks (Day 1 to Day 9), each lasting 30 minutes. Outcome measures: postural balance (Berg Balance Scale, BBS), functional mobility (Timed Up and Go, TUG; kinematic variables of 180° turning), and kinematic gait variables were assessed at baseline (Day 0), after four training sessions (Day 4), after nine training sessions (Day 9), and 40 days after completion of training (follow-up). Cardiovascular parameters (mean arterial pressure and heart rate) were evaluated at four moments within each training session. Analysis of variance (ANOVA) was used to compare outcomes between EG and CG over the course of the study (Day 0, Day 4, Day 9, and follow-up). Unpaired t-tests allowed for intergroup comparison at each training session. A 5% significance level was used for all tests. Results: 1) Cardiovascular parameters (systemic arterial pressure, heart rate, and derived variables) did not change after the interventions, and there were no differences between groups within each training session. There was an improvement in gait performance, with increased speed and distance covered, with no statistically significant difference between groups. 2) After the interventions, patients had increased paretic and non-paretic step lengths, in addition to exhibiting greater hip and knee joint excursion in both lower limbs. The gains were observed in both EG and CG, with no statistical difference between the groups, and were (mostly) maintained at follow-up. 3) After the interventions, patients showed better postural balance (higher BBS scores) and functional mobility (less time spent on the TUG test and better performance in the 180° turning). All gains were observed in both EG and CG, with no statistically significant difference between groups, and were maintained at follow-up.
Conclusions: The addition of load to the NPLL did not affect cardiovascular parameters in patients with subacute stroke, similarly to treadmill training without load, so it appears to be a safe training modality for these patients. However, the use of the load did not bring any additional benefits to gait training. The gait training program (nine treadmill training sessions plus strategies and exercises for paretic limb stimulation) was useful for improving gait performance and kinematics, functional mobility, and postural balance, and its use is suggested to optimize these outcomes in the subacute phase after stroke.
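
For the intergroup statistics mentioned (unpaired t-tests at each assessment), here is a minimal Python sketch comparing hypothetical EG and CG Berg Balance Scale scores; the values are invented placeholders, not trial data.

```python
# Hypothetical sketch: unpaired t-test between groups at one assessment.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
eg_bbs = rng.normal(loc=48, scale=4, size=19)  # experimental group, Day 9
cg_bbs = rng.normal(loc=47, scale=4, size=19)  # control group, Day 9

t, p = ttest_ind(eg_bbs, cg_bbs)
print(f"t = {t:.2f}, p = {p:.3f}")  # p >= 0.05 -> no intergroup difference
```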

Relevance: 10.00%

Abstract:

Sea surface temperature (SST) profiles over the last 25 kyr, derived from alkenone measurements, are studied in four cores from a W-E latitudinal transect encompassing the Gulf of Cadiz (Atlantic Ocean), the Alboran Sea, and the southern Tyrrhenian Sea (western Mediterranean). The results document the sensitivity of the Mediterranean region to the short climatic changes of the North Atlantic Ocean, particularly those involving the latitudinal position of the polar front. The amplitude of the SST oscillations increases toward the Tyrrhenian Sea, indicating an amplification of the Atlantic signal by the climatic regime of the Mediterranean region. All studied cores show a shorter cooling phase (700 years) for the Younger Dryas (YD) than that observed in the North Atlantic region (1200 years). This diachroneity is related to an intra-YD climatic change documented on the European continent. Minor oscillations in the southward displacement of the North Atlantic polar front may also have driven this early warming in the studied area. During the Holocene, a regional west-to-east diachroneity is observed for the SST maxima: 11.5-10.2 kyr B.P. in the Gulf of Cadiz, 10-9 kyr B.P. in the Alboran Sea, and 8.9-8.4 kyr B.P. in the Tyrrhenian Sea. A general cooling trend from these SST maxima to the present day is observed during this stage, marked by short cooling oscillations with a periodicity of 730±40 years and its harmonics.
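
To illustrate how a centennial-scale periodicity such as the ~730-year cycle reported here can be detected, the sketch below runs a Lomb-Scargle periodogram, which suits the uneven sampling typical of sediment-core series. The ages and temperatures are synthetic, not the cores' data.

```python
# Hypothetical sketch: Lomb-Scargle periodogram of an unevenly sampled
# synthetic SST series with an embedded 730-year cycle.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 11500, 200))            # ages in years B.P., uneven
sst = 18 + 0.5 * np.sin(2 * np.pi * t / 730) + rng.normal(0, 0.2, t.size)

periods = np.linspace(300, 2000, 500)              # candidate periods (years)
freqs = 2 * np.pi / periods                        # angular frequencies
power = lombscargle(t, sst - sst.mean(), freqs)
print(f"best period ~ {periods[np.argmax(power)]:.0f} years")
```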

Relevance: 10.00%

Abstract:

Purpose: To study the concentrations of diadenosine polyphosphates on the ocular surface after PRK and LASIK. Methods: Sixty-one patients (30 males and 31 females), with ages ranging from 20 to 63 years (34.04 ± 9.13 years), were recruited at the Balear Institute of Ophthalmology, Palma de Mallorca, Spain. LASIK was performed on 92 eyes of 46 patients and PRK on 25 eyes of 15 patients. Variations in the levels of the diadenosine polyphosphates Ap4A and Ap5A, Schirmer I (Jones test), TBUT, and corneal staining, together with the Dry Eye Questionnaire to evaluate discomfort and dryness, were studied. All tests were performed at the preoperative visit and at 1-day, 2-week, 1-month, and 3-month postoperative visits. Results: Ap4A showed a 5- and 3.5-fold increase at the 1-day visit for LASIK and PRK, respectively. LASIK patients continued to have statistically significantly higher concentrations (p = 0.01) throughout the follow-up. Ap5A showed no significant differences at any visit. Tear volume decreased during the 3 months in LASIK patients; the PRK cases had normal volume at 1 month. TBUT in LASIK increased at the 1-day visit (p = 0.002) and decreased from 2 weeks onwards; for PRK, it decreased by 35% at the 1-day visit and remained reduced for a month. Discomfort increased only at the 1-day visit (p = 0.007). Dryness frequency was similar at all visits. Conclusions: Ap4A levels are increased in refractive surgery patients only during the first day after surgery. This increase suggests that Ap4A may help accelerate the healing process.

Relevance: 10.00%

Abstract:

Recently, blood oxygen level-dependent (BOLD) functional magnetic resonance imaging (fMRI) has become a routine clinical procedure for the localization of language and motor brain regions and has been replacing more invasive preoperative procedures. However, the fMRI results from these tasks are not always reproducible, even in the same patient. Evaluating the reproducibility of language and speech mapping is especially complicated due to the complex brain circuitry that may become activated during the functional task. Non-language areas, such as sensory, attention, decision-making, and motor brain regions, may be activated in addition to the specific language regions during a traditional sentence-completion task. In this study, I test a new approach, which utilizes 4-minute video-based tasks, to map language and speech brain regions in patients undergoing brain surgery. Results from 35 subjects have shown that the video-based task activates Wernicke's area, as well as Broca's area, in most subjects. The computed laterality indices, which indicate the hemisphere dominant for that functional task, indicated left dominance for the video-based tasks. This study has shown that the video-based task may be an alternative method for localizing language and speech brain regions in patients who are unable to complete the sentence-completion task.
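
The laterality index (LI) mentioned above is conventionally computed as LI = (L - R) / (L + R) over activated voxel counts in homologous language regions. The sketch below applies that standard formula to invented counts; the 0.2 dominance threshold noted in the comment is a common convention, not a result of this study.

```python
# Hypothetical sketch: laterality index from suprathreshold voxel counts in
# left and right hemisphere language ROIs. Counts are illustrative.
def laterality_index(left_voxels: int, right_voxels: int) -> float:
    """Positive values indicate left-hemisphere dominance."""
    return (left_voxels - right_voxels) / (left_voxels + right_voxels)

li = laterality_index(left_voxels=840, right_voxels=310)
print(f"LI = {li:.2f}")  # LI > 0.2 is a common threshold for left dominance
```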