980 results for Resolution of problems


Relevance:

100.00%

Publisher:

Abstract:

Neutrophilic granulocytes play a major role in the initiation and resolution of the inflammatory response, and demonstrate significant transcriptional and translational activity. Although much was known about neutrophils prior to the introduction of proteomics, the use of MS-based methodologies has provided an unprecedented tool to confirm and extend previous findings. In the present study, we performed a Gel-LC-MS/MS analysis of neutrophil detergent insoluble and whole cell lysate fractions of resting neutrophils. We achieved a set of identifications through the use of high-resolution mass spectrometry and validation of its data. We identified a total of 1249 proteins with a wide range of intensities from both detergent-insoluble and whole cell lysate fractions, allowing a mapping of proteins such as those involved in intracellular transport (Rab and Sec family proteins) and cell signaling (S100 proteins). These results represent the most comprehensive proteomic characterization of resting human neutrophils to date, and provide important information relevant for further studies of the immune system in health and disease. The methods applied here can be employed to help us understand how neutrophils respond to various physiologic and pathophysiologic conditions and could be extended to protein quantitation after cell activation.

Relevance:

100.00%

Publisher:

Abstract:

Study Design. A randomized clinical trial with 1-year and 3-year telephone questionnaire follow-ups. Objective. To report a specific exercise intervention’s long-term effects on recurrence rates in acute, first-episode low back pain patients. Summary of Background Data. The pain and disability associated with an initial episode of acute low back pain (LBP) is known to resolve spontaneously in the short-term in the majority of cases. However, the recurrence rate is high, and recurrent disabling episodes remain one of the most costly problems in LBP. A deficit in the multifidus muscle has been identified in acute LBP patients, and does not resolve spontaneously on resolution of painful symptoms and resumption of normal activity. Any relation between this deficit and recurrence rate was investigated in the long-term. Methods. Thirty-nine patients with acute, first-episode LBP were medically managed and randomly allocated to either a control group or specific exercise group. Medical management included advice and use of medications. Intervention consisted of exercises aimed at rehabilitating the multifidus in cocontraction with the transversus abdominis muscle. One year and three years after treatment, telephone questionnaires were conducted with patients. Results. Questionnaire results revealed that patients from the specific exercise group experienced fewer recurrences of LBP than patients from the control group. One year after treatment, specific exercise group recurrence was 30%, and control group recurrence was 84% (P < 0.001). Two to three years after treatment, specific exercise group recurrence was 35%, and control group recurrence was 75% (P < 0.01). Conclusion. Long-term results suggest that specific exercise therapy in addition to medical management and resumption of normal activity may be more effective in reducing low back pain recurrences than medical management and normal activity alone. [Key Words: multifidus, low back pain, rehabilitation]

Relevance:

100.00%

Publisher:

Abstract:

The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to serve as their own controls, reducing interanimal variability. This permits longitudinal studies on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution, and image quantification could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation since they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement for a wide range of problems that could not be addressed by experimental or analytical approaches.
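The Monte Carlo idea sketched above can be illustrated with a deliberately minimal toy model, which is in no way a substitute for full simulation packages: the attenuation coefficient, path length, and assumption of perfect detectors are all illustrative choices, not values from the study.

```python
import math
import random

def coincidence_fraction(n_decays, mu_per_cm=0.096, path_cm=10.0, seed=1):
    """Toy Monte Carlo of PET coincidence detection: each decay emits two
    back-to-back 511 keV photons, and each photon escapes a water-like
    phantom with probability exp(-mu * path); a coincidence is counted
    only when both photons escape (detectors assumed perfect)."""
    rng = random.Random(seed)
    p_escape = math.exp(-mu_per_cm * path_cm)
    detected = sum(
        1 for _ in range(n_decays)
        if rng.random() < p_escape and rng.random() < p_escape
    )
    return detected / n_decays

# analytic expectation for comparison: exp(-2 * mu * path) ≈ 0.147
rate = coincidence_fraction(100_000)
```

Even this toy version shows the typical workflow: sample each random process per event, then compare counting statistics against an analytic expectation where one exists.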

Relevance:

100.00%

Publisher:

Abstract:

Optimization problems arise in science, engineering, economics, and elsewhere, and we need to find the best solution for each reality. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the algorithms available for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the functions involved are nonlinear and their derivatives are unknown or very difficult to calculate, suitable methods are rarer. Such functions are frequently called black-box functions. To solve these problems without constraints (unconstrained optimization), we can use direct search methods, which require neither derivatives nor approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow the solution of optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjust the penalty parameter dynamically.
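The combination described above can be sketched in a few lines: a quadratic penalty turns a constrained problem into a sequence of unconstrained ones, each solved by a derivative-free compass (direct) search. The test problem, penalty schedule, and parameters are illustrative, not taken from the chapter.

```python
def compass_search(f, x, step=0.5, tol=1e-6):
    """Derivative-free direct search: try +/- step along each coordinate
    axis, halving the step when no axis move improves the objective."""
    x = list(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                if f(y) < f(x):
                    x, improved = y, True
        if not improved:
            step *= 0.5
    return x

def penalty_method(f, g, x0, rho=1.0, outer=8):
    """Quadratic penalty for one inequality constraint g(x) <= 0:
    solve a sequence of unconstrained problems with growing rho."""
    x = list(x0)
    for _ in range(outer):
        x = compass_search(lambda z: f(z) + rho * max(0.0, g(z)) ** 2, x)
        rho *= 10.0
    return x

# illustrative problem: min (x-2)^2 + (y-2)^2  s.t.  x + y <= 2
# (the constrained optimum is at (1, 1))
f = lambda z: (z[0] - 2) ** 2 + (z[1] - 2) ** 2
g = lambda z: z[0] + z[1] - 2
x_opt = penalty_method(f, g, [0.0, 0.0])
```

Note that neither derivatives of `f` nor of `g` are ever evaluated, which is exactly the setting (black-box constraint functions) the chapter targets.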

Relevance:

100.00%

Publisher:

Abstract:

Redundant manipulators allow trajectory optimization, obstacle avoidance, and the resolution of singularities. For this type of manipulator, kinematic control algorithms adopt generalized inverse matrices, which may lead to unpredictable responses. Motivated by these problems, this paper studies the complexity revealed by the trajectory planning scheme when controlling redundant manipulators. The results reveal fundamental properties of the chaotic phenomena and give a deeper insight towards the development of superior trajectory control algorithms.
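The generalized-inverse kinematic control mentioned above can be sketched for a hypothetical planar three-link arm (three joints driving a two-dimensional task, hence redundant): joint rates come from the Moore-Penrose pseudoinverse of the Jacobian in a closed-loop resolved-rate scheme. The link lengths, gain, and target are illustrative, not the paper's setup.

```python
import numpy as np

LENGTHS = (1.0, 1.0, 1.0)  # illustrative link lengths of a planar 3R arm

def fk(theta):
    """End-effector position of the planar 3R arm (forward kinematics)."""
    angles = np.cumsum(theta)
    return np.array([
        sum(l * np.cos(a) for l, a in zip(LENGTHS, angles)),
        sum(l * np.sin(a) for l, a in zip(LENGTHS, angles)),
    ])

def jacobian(theta):
    """2x3 Jacobian: joint i moves links i..2, hence the tail sums."""
    angles = np.cumsum(theta)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -sum(l * np.sin(a) for l, a in zip(LENGTHS[i:], angles[i:]))
        J[1, i] = sum(l * np.cos(a) for l, a in zip(LENGTHS[i:], angles[i:]))
    return J

def resolve_rates(theta, target, steps=200, gain=0.5):
    """Closed-loop resolved-rate control using the Moore-Penrose
    pseudoinverse, which picks the minimum-norm joint velocity."""
    theta = np.array(theta, float)
    for _ in range(steps):
        err = target - fk(theta)
        theta += np.linalg.pinv(jacobian(theta)) @ (gain * err)
    return theta

theta_final = resolve_rates([0.3, 0.3, 0.3], np.array([1.5, 1.0]))
```

Because the pseudoinverse is not a repeatable mapping, closed joint-space trajectories are not guaranteed for closed task-space paths, which is one source of the unpredictable responses the paper investigates.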

Relevance:

100.00%

Publisher:

Abstract:

Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images. This means that several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach which aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient implementation of an unsupervised linear unmixing method, SISAL (simplex identification via split augmented Lagrangian), on GPUs using CUDA. The method finds the smallest simplex by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses. The results presented here indicate that the GPU implementation can significantly accelerate the method's execution on large datasets while maintaining the method's accuracy.
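The linear mixing model underlying the unmixing problem can be made concrete with a small sketch. This is not SISAL itself (which jointly estimates the endmember simplex); it only shows, on synthetic data, how abundances are recovered from a mixed pixel once the endmember spectra are known, using sum-to-one constrained least squares in closed form.

```python
import numpy as np

def scls_abundances(E, y):
    """Sum-to-one constrained least squares for the linear mixing model
    y = E @ a, where columns of E are endmember spectra and a holds the
    material fractions of one mixed pixel. Closed-form solution via a
    Lagrange multiplier on the equality constraint 1'a = 1."""
    G_inv = np.linalg.inv(E.T @ E)
    a_ls = G_inv @ E.T @ y          # unconstrained least-squares estimate
    ones = np.ones(E.shape[1])
    lam = (1.0 - ones @ a_ls) / (ones @ G_inv @ ones)
    return a_ls + lam * (G_inv @ ones)

# synthetic 5-band pixel mixed from 3 endmembers (illustrative data)
rng = np.random.default_rng(0)
E = rng.uniform(0.0, 1.0, size=(5, 3))
a_true = np.array([0.5, 0.3, 0.2])
y = E @ a_true
a_hat = scls_abundances(E, y)
```

In a real scene this inversion runs independently per pixel, which is precisely the data-parallel structure that makes GPU implementations of unmixing attractive.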

Relevance:

100.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to obtain the Master's Degree in Biotechnology

Relevance:

100.00%

Publisher:

Abstract:

Ionic liquids (ILs) belong to a class of compounds with unusual properties: very low vapour pressure, high chemical and thermal stability, and the ability to dissolve a wide range of substances. A new field of research is evaluating the possibility of using natural chiral biomolecules for the preparation of chiral ionic liquids (CILs). This important challenge in synthetic chemistry can open new avenues of research in order to avoid some problems related to the intrinsic biodegradability and toxicity associated with conventional ILs. The research work developed here aimed at the synthesis of CILs, their characterization, and possible applications, based on biological moieties used either as chiral cations or anions, depending on the synthetic manipulation of the derivatives. Overall, a total of 28 organic salts, including CILs, were synthesized: 9 based on L-cysteine derivatives, 12 based on L-proline, 3 based on nucleosides, and 4 based on nucleotides. All these new CILs were completely characterized and their chemical and physical properties were evaluated. Some CILs based on L-cysteine were applied in discrimination processes, including the resolution of racemates, and as chiral catalysts for asymmetric aldol condensation. L-proline-derived CILs were also studied as chiral catalysts for the Michael reaction. In parallel, the interactions of macrocyclic oligosugars called cyclodextrins (CDs) with several ILs were studied. It was possible to improve the solubility of CDs in water and serum. Additionally, fatty acids and steroids showed an increase in water solubility when ILs-CDs systems were used. The development of efficient and selective ILs-CDs systems is indispensable to expand the range of their applications in host-guest interactions, drug delivery systems, or catalytic reactions. Novel salts derived from nucleobases were used in order to enhance fluorescence in aqueous solution.
Additionally, preliminary studies regarding ethyl lactate as an alternative solvent for asymmetric organocatalysis were performed.

Relevance:

100.00%

Publisher:

Abstract:

After ischemic stroke, the ischemic damage to brain tissue evolves over time and with an uneven spatial distribution. Early irreversible changes occur in the ischemic core, whereas, in the penumbra, which receives more collateral blood flow, the damage is milder and delayed. A better characterization of the penumbra, irreversibly damaged and healthy tissues is needed to understand the mechanisms involved in tissue death. MRSI is a powerful tool for this task if the scan time can be decreased whilst maintaining high sensitivity. Therefore, we made improvements to a ¹H MRSI protocol to study middle cerebral artery occlusion in mice. The spatial distribution of changes in the neurochemical profile was investigated, with an effective spatial resolution of 1.4 μL, applying the protocol on a 14.1-T magnet. The acquired maps included the difficult-to-separate glutamate and glutamine resonances and, to our knowledge, the first mapping of the metabolites γ-aminobutyric acid and glutathione in vivo, within a metabolite measurement time of 45 min. The maps were in excellent agreement with findings from single-voxel spectroscopy and offer spatial information at a scan time acceptable for most animal models. The metabolites measured differed with respect to the temporal evolution of their concentrations and the localization of these changes. Specifically, lactate and N-acetylaspartate concentration changes largely overlapped with the T2-hyperintense region visualized with MRI, whereas changes in cholines and glutathione affected the entire middle cerebral artery territory. Glutamine maps showed elevated levels in the ischemic striatum until 8 h after reperfusion, and until 24 h in cortical tissue, indicating differences in excitotoxic effects and secondary energy failure in these tissue types. Copyright © 2011 John Wiley & Sons, Ltd.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The purposes of this study were to (1) develop a high-resolution 3-T magnetic resonance angiography (MRA) technique with an in-plane resolution approximate to that of multidetector coronary computed tomography (MDCT) and a voxel size of 0.35 × 0.35 × 1.5 mm³ and to (2) investigate the image quality of this technique in healthy participants and preliminarily in patients with known coronary artery disease (CAD). MATERIALS AND METHODS: A 3-T coronary MRA technique optimized for an image acquisition voxel as small as 0.35 × 0.35 × 1.5 mm³ (high-resolution coronary MRA [HRC]) was implemented and the coronary arteries of 22 participants were imaged. These included 11 healthy participants (average age, 28.5 years; 5 men) and 11 participants with CAD (average age, 52.9 years; 5 women) as identified on MDCT. In addition, the 11 healthy participants were imaged using a method with a more common spatial resolution of 0.7 × 1 × 3 mm³ (regular-resolution coronary MRA [RRC]). Qualitative and quantitative comparisons were made between the 2 MRA techniques. RESULTS: Normal vessels and CAD lesions were successfully depicted at 350 × 350 μm² in-plane resolution with adequate signal-to-noise ratio (SNR) and contrast-to-noise ratio. The CAD findings were consistent between MDCT and HRC. The HRC showed a 47% improvement in sharpness despite a reduction in SNR (by 72%) and in contrast-to-noise ratio (by 86%) compared with the RRC. CONCLUSION: This study, as a first step toward substantial improvement in the resolution of coronary MRA, demonstrates the feasibility of obtaining at 3 T a spatial resolution that approximates that of MDCT. The acquisition in-plane pixel dimensions are as small as 350 × 350 μm² with a 1.5-mm slice thickness. Although SNR is lower, the images have improved sharpness, resulting in image quality that allows qualitative identification of disease sites on MRA consistent with MDCT.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: Whether behavioural and emotional maladjustment is more prevalent in children with inflammatory bowel disease (IBD) than in healthy controls remains controversial. The aim of this study was to assess paediatric IBD patients for problems with emotional and behavioural adjustment and to examine associations with clinical and demographic variables. METHODS: Data from paediatric patients with IBD enrolled in the Swiss IBD Cohort Study and the results of both the parent-rated Strengths and Difficulties Questionnaire (SDQ) and the self-reported Child Depression Inventory (CDI) were analysed. Of the 148 registered patients, 126 had at least one questionnaire completed and were included. RESULTS: The mean age of 71 patients with Crohn's disease (44 males, 27 females) was 13.4 years, and 12.8 years for the 55 patients with ulcerative or indeterminate colitis. The mean duration of disease was 1.2 and 2.7 years, respectively. The total score of the SDQ was abnormal in 11.4% of cases compared to 10% in the normal population. Abnormal sub-scores were found in 20.2% of subjects for the domain of emotional problems and in 17.1% for problems with peers. The total CDI T score indicated a significantly lower prevalence of clinical depression in IBD patients than in normal youth. No correlation between the total SDQ scores or the CDI T scores and gender, type or duration of IBD, inflammatory markers or disease scores was found. CONCLUSIONS: The prevalence of problems with behavioural and emotional adjustment among Swiss paediatric IBD patients is low and comparable to that of the normal population.

Relevance:

100.00%

Publisher:

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes which clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since in reality, these parameters are known to be frequency-dependent and complex and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure that is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
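The deconvolution-based wavelet estimation discussed above can be sketched as a single frequency-domain step: given recorded traces and forward-modelled impulse responses, a common source wavelet is recovered by stabilized spectral division averaged over traces. This is a simplified, hypothetical illustration, not the thesis's iterative scheme, in which the impulse responses themselves come from the evolving inversion model.

```python
import numpy as np

def estimate_wavelet(traces, greens, eps=1e-4):
    """Wiener-style deconvolution: given traces d_i = w * g_i (circular
    convolution) and modelled impulse responses g_i, estimate the common
    source wavelet w by averaging D_i * conj(G_i) over traces and
    dividing by sum |G_i|^2 plus a small stabilization term."""
    num = np.zeros(traces.shape[1], dtype=complex)
    den = np.zeros(traces.shape[1])
    for d, g in zip(traces, greens):
        D, G = np.fft.fft(d), np.fft.fft(g)
        num += D * np.conj(G)
        den += np.abs(G) ** 2
    W = num / (den + eps * den.max())
    return np.real(np.fft.ifft(W))

# synthetic check: a Ricker-like pulse convolved with random responses
n = 64
t = np.arange(n) - 8.0
w_true = (1 - 0.5 * t ** 2) * np.exp(-0.25 * t ** 2)
rng = np.random.default_rng(1)
g = rng.standard_normal((4, n))
d = np.real(np.fft.ifft(np.fft.fft(w_true) * np.fft.fft(g, axis=1), axis=1))
w_est = estimate_wavelet(d, g)
```

The stabilization constant `eps` plays the role of damping at frequencies where the modelled responses carry little energy; in noisy field data it must be chosen far less casually than here.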

Relevance:

100.00%

Publisher:

Abstract:

The demand for accurate forecasting of the effects of global warming on biodiversity is growing, but current methods for forecasting have limitations. In this article, we compare and discuss the different uses of four forecasting methods: (1) models that consider species individually, (2) niche-theory models that group species by habitat (more specifically, by the environmental conditions under which a species can persist or does persist), (3) general circulation models and coupled ocean-atmosphere-biosphere models, and (4) species-area curve models that consider all species or large aggregates of species. After outlining the different uses and limitations of these methods, we make eight primary suggestions for improving forecasts. We find that greater use of the fossil record and of modern genetic studies would improve forecasting methods. We note a Quaternary conundrum: while current empirical and theoretical ecological results suggest that many species could be at risk from global warming, during the recent ice ages surprisingly few species became extinct. The potential resolution of this conundrum gives insights into the requirements for more accurate and reliable forecasting. Our eight suggestions also point to constructive synergies in solving the different problems.
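The species-area curve approach (method 4 above) rests on the empirical power law S = c·A^z, under which the fraction of species expected to persist after suitable area contracts is simply the ratio (A_new/A_old)^z. The sketch below uses z = 0.25, a commonly quoted illustrative exponent, not a value from the article.

```python
def fraction_surviving(area_fraction, z=0.25):
    """Species-area power law S = c * A**z: predicted fraction of species
    persisting when suitable area shrinks to `area_fraction` of its
    original extent (the constant c cancels in the ratio). z = 0.25 is
    an illustrative exponent, not a value from the article."""
    return area_fraction ** z

def fraction_lost(area_fraction, z=0.25):
    """Complementary extinction fraction under the same assumption."""
    return 1.0 - fraction_surviving(area_fraction, z)

# halving the climatically suitable area is predicted to lose
# roughly 16% of species under z = 0.25, not 50%
loss = fraction_lost(0.5)
```

The strongly sublinear response for small z is one reason species-area forecasts and individual-species models can disagree, and choosing z is itself a substantial source of forecast uncertainty.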

Relevance:

100.00%

Publisher:

Abstract:

A high-resolution three-dimensional (3D) seismic reflection system for small-scale targets in lacustrine settings has been developed. Its main characteristics include navigation and shot-triggering software that fires the seismic source at regular distance intervals (max. error of 0.25 m) with real-time control on navigation using differential GPS (Global Positioning System). Receiver positions are accurately calculated (error < 0.20 m) with the aid of GPS antennas attached to the end of each of three 24-channel streamers. Two telescopic booms hold the streamers at a distance of 7.5 m from each other. With a receiver spacing of 2.5 m, the bin dimension is 1.25 m in the inline and 3.75 m in the crossline direction. To test the system, we conducted a 3D survey of about 1 km² in Lake Geneva, Switzerland, over a complex fault zone. A 5-m shot spacing resulted in a nominal fold of 6. A double-chamber bubble-cancelling 15/15 in³ air gun (40-650 Hz) operated at 80 bars and 1 m depth gave a signal penetration of 300 m below the water bottom and a best vertical resolution of 1.1 m. Processing followed a conventional scheme, but had to be adapted to the high sampling rates, and our unconventional navigation data needed conversion to industry standards. The high-quality data enabled us to construct maps of seismic horizons and fault surfaces in three dimensions. The system proves to be well adapted to investigating complex structures by providing non-aliased images of reflectors with dips up to 30 degrees.