868 results for localized algorithms
Abstract:
In this work, the differentiability of the principal eigenvalue λ = λ₁(Γ) of the localized Steklov problem −Δu + qu = 0 in Ω, ∂u/∂ν = λχ_Γ(x)u on ∂Ω, where Γ ⊂ ∂Ω is a smooth subdomain of ∂Ω and χ_Γ is its characteristic function relative to ∂Ω, is shown. As a key point, the flux subdomain Γ is regarded here as the variable with respect to which the differentiation is performed. An explicit formula for the derivative of λ₁(Γ) with respect to Γ is obtained. A further intrinsic feature of the problem is the lack of regularity up to the boundary of the first derivative of the principal eigenfunctions. Therefore, the whole analysis must be done in the weak sense of H¹(Ω). The study is of interest for mathematical models in morphogenesis. (C) 2011 Elsevier Inc. All rights reserved.
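For readability, the eigenvalue problem stated above can be written in display form (a direct transcription of the abstract's statement):

```latex
% The localized Steklov problem: the spectral parameter \lambda acts
% only on the flux subdomain \Gamma of the boundary, via the
% characteristic function \chi_\Gamma.
\begin{equation*}
  \begin{cases}
    -\Delta u + q\,u = 0 & \text{in } \Omega, \\
    \dfrac{\partial u}{\partial \nu} = \lambda\,\chi_\Gamma(x)\,u & \text{on } \partial\Omega,
  \end{cases}
  \qquad \Gamma \subset \partial\Omega \ \text{a smooth subdomain},
\end{equation*}
```

with λ₁(Γ) denoting its principal eigenvalue.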
Abstract:
BACKGROUND: Scleroderma is a chronic autoimmune disease characterized by progressive connective tissue sclerosis and microcirculatory changes. Localized scleroderma is considered a limited disease. However, in some cases atrophic and deforming lesions may be observed that hinder normal development. Literature reports indicate phototherapy as a therapeutic modality with a favorable response in cutaneous forms of scleroderma. OBJECTIVES: This study assessed phototherapy treatment for localized scleroderma. METHODS: Patients with localized scleroderma were selected for phototherapy treatment. They were classified according to the type of localized scleroderma and the evolutionary stage of the lesions. Clinical examination and skin ultrasound were used to document the results obtained. RESULTS: Some clinical improvement was observed after an average of 10 phototherapy sessions. All skin lesions were softer on clinical palpation, with score reductions in pre- versus post-treatment comparison. Ultrasound showed that most of the assessed lesions presented a decrease in dermal thickness; only five maintained their previous measurements. Treatment response was similar regardless of the type of phototherapeutic treatment employed. CONCLUSIONS: The proposed treatment was effective for all lesions, regardless of the phototherapeutic modality employed. The improvement was observed in all treated skin lesions and confirmed by clinical evaluation and skin ultrasound.
Abstract:
This paper addresses the probabilistic analysis of corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion that must be evaluated for every type of structure, especially for structures built in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability. Therefore, by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. This process begins when a threshold chloride concentration is reached at the steel reinforcement bars. Despite the robustness of several models proposed in the literature, deterministic approaches fail to accurately predict the corrosion initiation time due to the inherent randomness observed in this process. In this regard, durability can be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride ion penetration is simulated using Fick's second law of diffusion, which represents the chloride diffusion process while accounting for time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Some examples are considered in order to study these phenomena, and a simplified method is proposed to determine optimal values for concrete cover.
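As a hedged illustration of the approach described above (not the paper's actual model; the distributions and parameter values below are assumptions), the error-function solution of Fick's second law for a semi-infinite medium, C(x,t) = Cs·(1 − erf(x / 2√(D·t))), can be combined with Monte Carlo sampling to estimate the probability of corrosion initiation:

```python
# A minimal sketch, assuming illustrative (not the paper's) parameter
# distributions: Monte Carlo estimate of the probability that the
# chloride concentration at the rebar depth exceeds the threshold.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(42)
n_samples = 100_000
t = 50.0 * 365.25 * 24 * 3600          # exposure time: 50 years [s]

# Hypothetical random variables (distribution choices are assumptions):
Cs    = rng.lognormal(mean=np.log(3.5), sigma=0.3, size=n_samples)    # surface chloride [kg/m^3]
D     = rng.lognormal(mean=np.log(5e-12), sigma=0.4, size=n_samples)  # diffusion coeff. [m^2/s]
cover = rng.normal(loc=0.05, scale=0.008, size=n_samples)             # concrete cover [m]
C_th  = rng.normal(loc=0.9, scale=0.15, size=n_samples)               # threshold [kg/m^3]

# Error-function solution of Fick's second law:
#   C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t))))
C_at_rebar = Cs * (1.0 - erf(cover / (2.0 * np.sqrt(D * t))))

# Failure = corrosion initiation: concentration at the bars exceeds threshold.
pf = np.mean(C_at_rebar > C_th)
print(f"Estimated probability of corrosion initiation at 50 years: {pf:.4f}")
```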
Abstract:
The spectral reflectance of the sea surface recorded using ocean colour satellite sensors has been used to estimate chlorophyll-a concentrations for decades. However, in bio-optically complex coastal waters, these estimates are compromised by the presence of several other coloured components besides chlorophyll, especially in regions affected by low-salinity waters. The present work aims to (a) describe the influence of the freshwater plume from the La Plata River on the variability of in situ remote sensing reflectance and (b) evaluate the performance of operational ocean colour chlorophyll algorithms applied to Southwestern Atlantic waters, which receive a remarkable seasonal contribution from La Plata River discharges. Data from three oceanographic cruises are used, in addition to a historical regional bio-optical dataset. Deviations found between measured and estimated concentrations of chlorophyll-a are examined in relation to surface water salinity and turbidity gradients to investigate the source of errors in satellite estimates of pigment concentrations. We observed significant seasonal variability in surface reflectance properties that is strongly driven by La Plata River plume dynamics and arises from the presence of high levels of inorganic suspended solids and coloured dissolved materials. As expected, existing operational algorithms overestimate the concentration of chlorophyll-a, especially in waters of low salinity (S < 33.5) and high turbidity (Rrs(670) > 0.0012 sr⁻¹). Additionally, an updated version of the regional algorithm is presented, which clearly improves the chlorophyll estimation in these types of coastal environment. In general, the techniques presented here allow us to directly distinguish the bio-optical types of waters to be considered in algorithm studies by the ocean colour community.
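As a hedged illustration of how operational band-ratio chlorophyll algorithms of the kind evaluated above work (the polynomial-in-log10 form is the standard OCx structure, but the coefficients below are placeholders, not those of any official product or of the regional algorithm):

```python
# A minimal sketch of an OCx-style polynomial band-ratio chlorophyll
# algorithm. The coefficients are ILLUSTRATIVE PLACEHOLDERS, not NASA's
# operational values or the regional tuning mentioned in the abstract.
import numpy as np

def chl_band_ratio(rrs_blue, rrs_green, coeffs=(0.3, -2.9, 1.7, -0.6, -1.0)):
    """Estimate chlorophyll-a [mg m^-3] from a blue/green Rrs band ratio.

    rrs_blue  : maximum of the blue-band remote sensing reflectances [sr^-1]
    rrs_green : green-band remote sensing reflectance [sr^-1]
    """
    x = np.log10(rrs_blue / rrs_green)            # maximum band ratio, log10
    log10_chl = sum(a * x**i for i, a in enumerate(coeffs))
    return 10.0 ** log10_chl

# In turbid, CDOM-rich plume waters the blue/green ratio is depressed by
# non-algal absorption, which is why such algorithms overestimate
# chlorophyll there.
print(chl_band_ratio(rrs_blue=0.004, rrs_green=0.003))
```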
Abstract:
Parallel kinematic structures are considered very adequate architectures for positioning and orienting the tools of robotic mechanisms. However, developing dynamic models for this kind of system is sometimes a difficult task. In fact, the direct application of traditional methods of robotics for modelling and analysing such systems usually does not lead to efficient and systematic algorithms. This work addresses this issue: it presents a modular approach to generate the dynamic model and shows how, through some convenient modifications, these methods can be made more applicable to parallel structures as well. Kane's formulation for obtaining the dynamic equations is shown to be one of the easiest ways to deal with redundant coordinates and kinematic constraints, so that a suitable choice of a set of coordinates allows the remainder of the modelling procedure to be computer aided. The advantages of this approach are discussed in the modelling of a 3-dof parallel asymmetric mechanism.
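As a hedged sketch of Kane's formulation in practice (a one-degree-of-freedom mass-spring toy, not the paper's 3-dof parallel mechanism), the computer-aided workflow can be shown with sympy.physics.mechanics:

```python
# A minimal sketch of Kane's method on a 1-dof mass-spring system
# (a toy stand-in for the paper's 3-dof parallel mechanism). The
# workflow -- choose generalized speeds, state kinematic differential
# equations, form Fr + Fr* = 0 -- is the one the abstract refers to.
import sympy as sp
from sympy.physics.mechanics import (
    dynamicsymbols, ReferenceFrame, Point, Particle, KanesMethod)

q, u = dynamicsymbols('q u')        # generalized coordinate and speed
qd = dynamicsymbols('q', 1)         # dq/dt
m, k = sp.symbols('m k', positive=True)

N = ReferenceFrame('N')             # inertial frame
O = Point('O'); O.set_vel(N, 0)     # fixed origin
P = O.locatenew('P', q * N.x)       # particle position
P.set_vel(N, u * N.x)               # velocity expressed via the speed u

block = Particle('block', P, m)
kd = [qd - u]                       # kinematic differential equation
loads = [(P, -k * q * N.x)]         # linear spring restoring force

km = KanesMethod(N, q_ind=[q], u_ind=[u], kd_eqs=kd)
fr, frstar = km.kanes_equations([block], loads)
print(sp.simplify(fr + frstar))     # -> Matrix([[-k*q - m*u']]), i.e. m*u' + k*q = 0
```

For a parallel mechanism, the same machinery is applied with redundant coordinates plus constraint equations passed to KanesMethod, which is precisely where the formulation pays off.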
Abstract:
In molecular and atomic devices, the interaction between electrons and ionic vibrations plays an important role in electronic transport. The electron-phonon coupling can cause the loss of the electron's phase coherence, the opening of new conductance channels and the suppression of purely elastic ones. From the technological viewpoint, phonons may limit the efficiency of electronic devices through energy dissipation, causing heating, power loss and instability. The state of the art in electron transport calculations consists of combining ab initio calculations via Density Functional Theory (DFT) with the Non-Equilibrium Green's Function formalism (NEGF). In order to include electron-phonon interactions, one needs, in principle, to include a scattering self-energy term in the open-system Hamiltonian that takes into account the effect of the phonons on the electrons and vice versa. Nevertheless, this term can be obtained approximately by perturbative methods. In the First Born Approximation, one considers only the first-order terms of the electronic Green's function expansion. In the Self-Consistent Born Approximation, the interaction self-energy is calculated with the perturbed electronic Green's function in a self-consistent way. In this work we describe how to incorporate the electron-phonon interaction into the SMEAGOL program (Spin and Molecular Electronics in Atomically Generated Orbital Landscapes), an ab initio code for electronic transport based on the combination of DFT + NEGF. This provides a tool for calculating the transport properties of specific material systems, particularly in molecular electronics. Preliminary results will be presented, showing the effects produced by considering the electron-phonon interaction in nanoscale devices.
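A hedged toy sketch of the self-consistency loop described above (this is not SMEAGOL's DFT+NEGF implementation: it assumes a single local phonon mode in the elastic limit, where the Born self-energy reduces to a site-diagonal term proportional to the Green's function):

```python
# A toy Self-Consistent Born Approximation (SCBA) loop for a 1D
# tight-binding chain with wide-band leads. NOT SMEAGOL's code: the
# phonon self-energy sigma_ph(E) ~ g^2 * diag(G(E)) assumes one local
# mode in the elastic (dephasing) limit, a strong simplification.
import numpy as np

n = 6                                   # device sites
t = 1.0                                 # hopping [eV]
g = 0.2                                 # electron-phonon coupling [eV]
eta = 1e-6                              # positive infinitesimal
E = 0.3                                 # evaluation energy [eV]

H = -t * (np.eye(n, k=1) + np.eye(n, k=-1))   # nearest-neighbour chain

gamma = 0.5                             # wide-band lead broadening [eV]
sigma_L = np.zeros((n, n), complex); sigma_L[0, 0]   = -0.5j * gamma
sigma_R = np.zeros((n, n), complex); sigma_R[-1, -1] = -0.5j * gamma

sigma_ph = np.zeros((n, n), complex)
for it in range(200):                   # SCBA self-consistency loop
    G = np.linalg.inv((E + 1j * eta) * np.eye(n)
                      - H - sigma_L - sigma_R - sigma_ph)
    new = g**2 * np.diag(np.diag(G))    # local (site-diagonal) Born term
    if np.max(np.abs(new - sigma_ph)) < 1e-10:
        break
    sigma_ph = new

# Coherent part of the transmission, T = Tr[Gamma_L G Gamma_R G^dagger].
Gamma_L = 1j * (sigma_L - sigma_L.conj().T)
Gamma_R = 1j * (sigma_R - sigma_R.conj().T)
T = np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real
print(f"SCBA converged in {it + 1} iterations; T(E={E}) = {T:.4f}")
```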
Abstract:
Background: Cervical cancer is treated mainly by surgery and radiotherapy. Toxicity due to radiation is a limiting factor for treatment success. Determination of lymphocyte radiosensitivity by radio-induced apoptosis arises as a possible method for the development of a predictive test. The aim of this study was to analyze radio-induced apoptosis of peripheral blood lymphocytes. Methods: Ninety-four consecutive patients suffering from cervical carcinoma, diagnosed and treated in our institution, and four healthy controls were included in the study. Toxicity was evaluated using the Lent-Soma scale. Peripheral blood lymphocytes were isolated, irradiated at 0, 1, 2 and 8 Gy, and assessed after 24, 48 and 72 hours of incubation. Apoptosis was measured by flow cytometry using annexin V/propidium iodide to determine early and late apoptosis. Lymphocytes were marked with a CD45 APC-conjugated monoclonal antibody. Results: Radiation-induced apoptosis (RIA) increased with radiation dose and incubation time. Data fitted strongly to a semilogarithmic model: RIA = β·ln(Gy) + α. This mathematical model is defined by two constants: α, the Y-intercept of the curve, determines the percentage of spontaneous cell death, and β, the slope of the curve, determines the percentage of cell death induced at a given radiation dose (β = ΔRIA/Δln(Gy)). Higher β values (increased rate of RIA at given radiation doses) were observed in patients with low sexual toxicity (Exp(B) = 0.83, C.I. 95% (0.73-0.95), p = 0.007; Exp(B) = 0.88, C.I. 95% (0.82-0.94), p = 0.001; Exp(B) = 0.93, C.I. 95% (0.88-0.99), p = 0.026 for 24, 48 and 72 hours respectively). This relation was also found for rectal (Exp(B) = 0.89, C.I. 95% (0.81-0.98), p = 0.026; Exp(B) = 0.95, C.I. 95% (0.91-0.98), p = 0.013 for 48 and 72 hours respectively) and urinary (Exp(B) = 0.83, C.I. 95% (0.71-0.97), p = 0.021 for 24 hours) toxicity. Conclusion: Radiation-induced apoptosis at different time points and radiation doses fitted a semilogarithmic model defined by a mathematical equation that gives an individual value of radiosensitivity and could predict late toxicity due to radiotherapy. Further prospective studies with larger numbers of patients are needed to validate these results.
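A hedged sketch of fitting the semilogarithmic model above by least squares (the data points are made-up placeholders, not the study's measurements):

```python
# A minimal sketch, with MADE-UP example data, of fitting the
# semilogarithmic model RIA = beta * ln(dose) + alpha by least squares.
import numpy as np

dose = np.array([1.0, 2.0, 8.0])        # Gy (0 Gy gives spontaneous death
ria  = np.array([12.0, 17.5, 28.0])     # directly and is excluded, since
                                        # ln(0) is undefined)

beta, alpha = np.polyfit(np.log(dose), ria, deg=1)
print(f"alpha (spontaneous death, %)   = {alpha:.2f}")
print(f"beta  (radiosensitivity slope) = {beta:.2f}")

# Predicted RIA at 4 Gy under the fitted model:
print(f"predicted RIA at 4 Gy = {beta * np.log(4.0) + alpha:.2f} %")
```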
Abstract:
This work is structured as follows. In Section 1 we discuss the clinical problem of heart failure. In particular, we present the phenomenon known as ventricular mechanical dyssynchrony: its impact on cardiac function, the therapy for its treatment and the methods for its quantification. Specifically, we describe the conductance catheter and its use for the measurement of dyssynchrony. At the end of Section 1, we propose a new set of indexes to quantify dyssynchrony, which are studied and validated thereafter. In Section 2 we describe the studies carried out in this work: we report the experimental protocols, and we present and discuss the results obtained. Finally, we report the overall conclusions drawn from this work and envisage future work and possible clinical applications of our results. Ancillary studies that were carried out during this work, mainly to investigate several aspects of cardiac resynchronization therapy (CRT), are mentioned in the Appendix. -------- Ventricular mechanical dyssynchrony plays a regulating role already in normal physiology but is especially important in pathological conditions such as hypertrophy, ischemia, infarction, or heart failure (Chapters 1, 2). Several prospective randomized controlled trials supported the clinical efficacy and safety of cardiac resynchronization therapy (CRT) in patients with moderate or severe heart failure and ventricular dyssynchrony. CRT resynchronizes ventricular contraction by simultaneous pacing of both the left and right ventricle (biventricular pacing) (Chapter 1). Currently, the conductance catheter method is used extensively to assess global systolic and diastolic ventricular function and, more recently, the ability of this instrument to pick up multiple segmental volume signals has been used to quantify mechanical ventricular dyssynchrony. Specifically, novel indexes based on volume signals acquired with the conductance catheter were introduced to quantify dyssynchrony (Chapters 3, 4). The present work aimed to describe the characteristics of the conductance-volume signals, to investigate the performance of the indexes of ventricular dyssynchrony described in the literature, and to introduce and validate improved dyssynchrony indexes. Moreover, using the conductance catheter method and the new indexes, the clinical problem of ventricular pacing site optimization was addressed, and the measurement protocol to adopt for hemodynamic tests on cardiac pacing was investigated. In accordance with the aims of the work, in addition to the classical time-domain parameters, a new set of indexes has been extracted, based on a coherent averaging procedure and on spectral and cross-spectral analysis (Chapter 4). Our analyses were carried out on patients with indications for electrophysiologic study or device implantation (Chapter 5). For the first time, besides patients with heart failure, indexes of mechanical dyssynchrony based on the conductance catheter were extracted and studied in a population of patients with preserved ventricular function, providing information on the normal range of such values. By performing a frequency-domain analysis and by applying an optimized coherent averaging procedure (Chapter 6.a), we were able to describe some characteristics of the conductance-volume signals (Chapter 6.b). We unmasked the presence of considerable beat-to-beat variations in dyssynchrony, which seemed more frequent in patients with ventricular dysfunction and appeared to play a role in discriminating between patients.
These non-recurrent mechanical ventricular non-uniformities are probably the expression of the substantial beat-to-beat hemodynamic variations often associated with heart failure and due to cardiopulmonary interaction and conduction disturbances. We investigated how the coherent averaging procedure may affect or refine the conductance-based indexes; in addition, we proposed and tested a new set of indexes which quantify the non-periodic components of the volume signals (a sketch of such a coherent averaging step is given below). Using the new set of indexes, we studied the acute effects of CRT and of right ventricular pacing in patients with heart failure and patients with preserved ventricular function. In the overall population we observed a correlation between the hemodynamic changes induced by the pacing and the indexes of dyssynchrony, which may have practical implications for hemodynamically guided device implantation. The optimal ventricular pacing site for patients with conventional indications for pacing remains controversial, and the majority of them do not meet current clinical indications for CRT pacing. Thus, we carried out an analysis to compare the impact of several ventricular pacing sites on global and regional ventricular function and dyssynchrony (Chapter 6.c). We observed that right ventricular pacing worsens cardiac function in patients with and without ventricular dysfunction unless the pacing site is optimized. CRT preserves left ventricular function in patients with normal ejection fraction and improves function in patients with poor ejection fraction despite no clinical indication for CRT. Moreover, the analysis of the results obtained using the new indexes of regional dyssynchrony suggests that the pacing site may influence overall global ventricular function depending on its relative effects on regional function and synchrony. Another clinical problem investigated in this work is the optimal right ventricular lead location for CRT (Chapter 6.d). As in the previous analysis, using novel parameters describing local synchrony and efficiency, we tested, and confirmed, the hypothesis that biventricular pacing with alternative right ventricular pacing sites produces acute improvement of ventricular systolic function and improves mechanical synchrony when compared to standard right ventricular pacing. Although no specific right ventricular location was shown to be superior during CRT, the right ventricular pacing site that produced the optimal acute hemodynamic response varied between patients. Acute hemodynamic effects of cardiac pacing are conventionally evaluated after stabilization episodes, but the duration of the stabilization periods applied in most cardiac pacing studies varies considerably. With an ad hoc protocol (Chapter 6.e) and indexes of mechanical dyssynchrony derived from the conductance catheter, we demonstrated that the use of stabilization periods during the evaluation of cardiac pacing may mask early changes in systolic and diastolic intra-ventricular dyssynchrony. In fact, at the onset of ventricular pacing, the main dyssynchrony and ventricular performance changes occur within a 10 s time span, initiated by the changes in ventricular mechanical dyssynchrony induced by aberrant conduction and followed by a partial or even complete recovery. It has already been demonstrated in normal animals that ventricular mechanical dyssynchrony may act as a physiologic modulator of cardiac performance together with heart rate, contractile state, preload and afterload.
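A hedged sketch of a coherent (beat-synchronous) averaging step of the kind referred to above (a generic illustration, not the thesis's optimized procedure): beats are aligned on fiducial points such as R-wave times and averaged sample by sample, so that the recurrent component is retained while non-recurrent beat-to-beat variations are attenuated.

```python
# A generic sketch of coherent averaging of a segmental volume signal.
# Periodic (recurrent) components survive the average; non-recurrent
# beat-to-beat variations are attenuated by ~1/sqrt(n_beats). This is
# an illustration, not the thesis's optimized algorithm.
import numpy as np

def coherent_average(signal: np.ndarray, beat_starts: np.ndarray) -> np.ndarray:
    """Average the beats of `signal` delimited by `beat_starts` (sample indices)."""
    n = int(np.diff(beat_starts).min())          # truncate to shortest beat
    beats = np.stack([signal[s:s + n] for s in beat_starts[:-1]])
    return beats.mean(axis=0)

# Synthetic example: a periodic volume waveform plus non-recurrent noise.
fs, beat_len, n_beats = 250, 200, 30             # Hz, samples/beat, beats
tt = np.arange(beat_len * n_beats) / fs
clean = 60 + 30 * np.sin(2 * np.pi * tt * fs / beat_len)   # toy waveform [mL]
noisy = clean + np.random.default_rng(0).normal(0, 8, clean.size)

starts = np.arange(0, noisy.size + 1, beat_len)  # ideal fiducial indices
avg_beat = coherent_average(noisy, starts)
print("residual std after averaging:",
      np.std(avg_beat - clean[:beat_len]).round(2), "mL")
```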
The present observation, which shows the compensatory mechanism of mechanical dyssynchrony, suggests that ventricular dyssynchrony may be regarded as an intrinsic cardiac property, with baseline dyssynchrony at an increased level in heart failure patients. To make available an independent system for cardiac output estimation, in order to confirm the results obtained with the conductance volume method, we developed and validated a novel technique to apply the Modelflow method (a method that derives an aortic flow waveform from arterial pressure by simulation of a non-linear three-element aortic input impedance model; Wesseling et al. 1993) to the left ventricular pressure signal, instead of the arterial pressure used in the classical approach (Chapter 7). The results confirmed that, in patients without valve abnormalities undergoing conductance catheter evaluations, continuous monitoring of cardiac output using the intra-ventricular pressure signal is reliable. Thus, cardiac output can be monitored quantitatively and continuously with a simple and low-cost method. During this work, additional studies were carried out to investigate several areas of uncertainty in CRT. The results of these studies are briefly presented in the Appendix: the long-term survival of patients treated with CRT in clinical practice, the effects of CRT in patients with mild symptoms of heart failure and in very old patients, limited thoracotomy as a second-choice alternative to transvenous implant for CRT delivery, the evolution and prognostic significance of the diastolic filling pattern in CRT, the selection of candidates for CRT with echocardiographic criteria, and the prediction of response to the therapy.
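A hedged sketch of the flow-from-pressure idea behind the Modelflow approach mentioned above (a linear three-element windkessel toy with constant, assumed parameters; the actual method uses non-linear, pressure-dependent aortic properties and is calibrated quite differently):

```python
# A toy flow-from-pressure simulation with a LINEAR three-element
# windkessel: characteristic impedance Z0, compliance Cw, peripheral
# resistance Rp. The real Modelflow method (Wesseling et al. 1993)
# uses non-linear, pressure-dependent aortic properties; the constant
# parameters here are a deliberate simplification.
import numpy as np

fs = 200.0                                   # sampling rate [Hz]
t = np.arange(0, 5, 1 / fs)                  # 5 s of signal
# Synthetic pressure waveform [mmHg] (placeholder, not patient data):
p = 80 + 40 * np.clip(np.sin(2 * np.pi * 1.2 * t), 0, None)

Z0, Cw, Rp = 0.06, 1.6, 1.0                  # assumed [mmHg*s/mL], [mL/mmHg], [mmHg*s/mL]
q = np.zeros_like(p)                         # aortic flow [mL/s]
pc = p[0]                                    # pressure across Cw || Rp

for i in range(1, p.size):
    q[i] = max((p[i] - pc) / Z0, 0.0)        # flow through Z0 (valve: no backflow)
    dpc = (q[i] - pc / Rp) / Cw              # charge balance on the compliance
    pc += dpc / fs                           # forward-Euler integration

cardiac_output = q.mean() * 60 / 1000        # [L/min]
print(f"estimated cardiac output: {cardiac_output:.2f} L/min")
```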
Abstract:
Large scale wireless ad hoc networks of computers, sensors, PDAs etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad hoc networks are sensor networks, which are usually composed of small units able to sense and transmit to a sink elementary data which are successively processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad hoc network does not suffer from the dead-end problem, which happens in geographic-based routing when a node is unable to locate a neighbor closer to the destination than itself (a sketch of this failure mode is given below). WGrid allows multidimensional data management, since nodes' virtual coordinates can act as a distributed database without requiring any special implementation or reorganization. Any kind of data (both one- and multi-dimensional) can be distributed, stored and managed. We show how a location service can be easily implemented so that any search is reduced to a simple query, as for any other data type. WGrid has then been extended by adopting a replication methodology; we called the resulting algorithm WRGrid. Just like WGrid, WRGrid acts as a distributed database without requiring either special implementation or reorganization, and any kind of data can be distributed, stored and managed. We have evaluated the benefits of replication on data management, finding, from experimental results, that it can halve the average number of hops in the network. The direct consequences are a significant improvement in energy consumption and workload balancing among sensors (number of messages routed by each node). Finally, thanks to the replications, whose number can be chosen arbitrarily, the resulting sensor network can withstand sensor disconnections/connections due to sensor failures without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery from link and/or device failures that may occur due to crashes or battery exhaustion of devices or to temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive number of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. Performance was compared with existing algorithms in order to validate the results.
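A hedged sketch of the dead-end (local-minimum) problem referenced above, which WGrid avoids by construction (this is generic greedy geographic forwarding, not WGrid's virtual-coordinate routing rules):

```python
# A generic sketch of greedy geographic forwarding and its dead-end
# (local-minimum) failure mode -- the problem WGrid's virtual
# coordinates avoid by construction. NOT WGrid's routing rule.
import math

def greedy_route(nodes, neighbors, src, dst):
    """Forward greedily to the neighbor closest to dst; report dead ends."""
    dist = lambda a, b: math.dist(nodes[a], nodes[b])
    path, cur = [src], src
    while cur != dst:
        nxt = min(neighbors[cur], key=lambda n: dist(n, dst))
        if dist(nxt, dst) >= dist(cur, dst):
            return path, False           # dead end: no neighbor is closer
        path.append(nxt); cur = nxt
    return path, True

# A void between B and D: B's only neighbors lead away from D, so greedy
# forwarding stalls even though the path A-B-C-D exists.
nodes = {'A': (0, 0), 'B': (1, 0), 'C': (1, 2), 'D': (3, 0)}
neighbors = {'A': ['B'], 'B': ['A', 'C'], 'C': ['B', 'D'], 'D': ['C']}
path, ok = greedy_route(nodes, neighbors, 'A', 'D')
print(path, "delivered" if ok else "stuck at a local minimum")
```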
Abstract:
The inherent stochastic character of most of the physical quantities involved in engineering models has led to ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve any kind of stochastic mechanics problem accurately is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability distribution. The computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. Moreover, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The topic of the second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of constituent bars in existing truss structures, using static loads and strain measurements. In cases of missing data and of damage affecting only a small portion of a bar, genetic algorithms proved to be an effective tool for solving the problem.
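A hedged sketch of the translation-field idea underlying such simulation methods (a generic 1D spectral-representation example mapped to a non-Gaussian marginal; this illustrates the concept only and is not the thesis's original algorithm, which also iterates to match the target spectrum of the translated field):

```python
# A generic sketch of a 1D "translation field": a homogeneous Gaussian
# sample is generated by spectral representation, then mapped through
# Phi -> inverse target CDF to impose a non-Gaussian (here lognormal)
# marginal. NOT the thesis's algorithm; the PSD below is a placeholder.
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(1)
n, L = 1024, 100.0                       # grid points, domain length
x = np.linspace(0.0, L, n)
dw = 2 * np.pi / L                       # frequency step
w = dw * (np.arange(512) + 0.5)          # angular frequencies

S = np.exp(-w**2)                        # placeholder PSD of the Gaussian field

# Spectral representation: superposition of cosines with random phases.
phases = rng.uniform(0, 2 * np.pi, w.size)
amps = np.sqrt(2 * S * dw)
g = (amps[:, None] * np.cos(np.outer(w, x) + phases[:, None])).sum(axis=0)
g /= g.std()                             # enforce unit variance

# Translation: map the standard-Gaussian marginal to a lognormal one.
field = lognorm(s=0.5).ppf(norm.cdf(g))
print(f"translated field mean={field.mean():.3f}, std={field.std():.3f}")
```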