55 results for Phase-space Methods
Abstract:
Cerebral microdialysis (CMD) is a technique that provides relevant information for monitoring cerebral metabolism in neurocritical patients. Lactate and the lactate-pyruvate ratio (LPR) are two markers used to detect cerebral hypoxia in patients who have suffered a traumatic brain injury (TBI). These two markers can be abnormally elevated in circumstances that do not involve tissue hypoxia. In addition, the recent introduction of CMD catheters with larger pores, known as "high-resolution" catheters, broadens the range of molecules that can be detected in the dialysate. Objectives: 1) to describe the characteristics of cerebral energy metabolism observed in the acute phase in patients who have suffered a TBI, based on the two indicators of anaerobic metabolism, lactate and the LPR, and 2) to determine the relative recovery (RR) of the molecules involved in the neuroinflammatory response: IL-1β, IL-6, IL-8 and IL-10. Material and methods: 46 patients were selected from a cohort of patients with moderate or severe TBI admitted to the Intensive Care Unit of the Hospital Universitari de la Vall d'Hebron and monitored with CMD. Lactate and LPR levels were analysed and correlated with PtiO2 levels. In vitro experiments were carried out to study the recovery of 100 kDa membranes in order to interpret the actual levels of the studied molecules in the extracellular space of brain tissue. Results: The agreement between lactate and the LPR in identifying episodes of metabolic dysfunction was weak (kappa index = 0.36, 95% CI: 0.34-0.39). In more than 80% of the cases in which both lactate and the LPR were increased, PtiO2 values were within the normal range (PtiO2 > 15 mmHg). The recovery of cytokines across the microdialysis membrane was lower than expected given the pore size of the membrane. Conclusions: elevated lactate and LPR were a frequent finding after TBI and, in most cases, were not related to episodes of tissue hypoxia. Furthermore, membrane pore size is not the only parameter that determines the RR of macromolecules.
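The weak concordance reported above is quantified with Cohen's kappa. As a minimal illustration only (the thresholds and data below are invented and are not taken from the study), the sketch computes kappa between two hypothetical binary classifications of monitoring intervals, one flagging elevated lactate and one flagging an elevated LPR:

```python
# Minimal sketch: agreement (Cohen's kappa) between two binary markers.
# Thresholds and data are illustrative placeholders, not values from the study.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
lactate = rng.normal(3.0, 1.5, size=500)          # hypothetical lactate values (mmol/L)
lp_ratio = lactate * 8 + rng.normal(0, 12, 500)   # hypothetical lactate-pyruvate ratio

lactate_high = lactate > 4.0                      # illustrative cut-off
lpr_high = lp_ratio > 25.0                        # illustrative cut-off

kappa = cohen_kappa_score(lactate_high, lpr_high)
print(f"Cohen's kappa between the two markers: {kappa:.2f}")
```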
Abstract:
The objective of traffic engineering is to optimize network resource utilization. Although several works have been published on minimizing network resource utilization, few have focused on the LSR (label switched router) label space. This paper proposes an algorithm that takes advantage of the MPLS label stack features in order to reduce the number of labels used in LSPs. Some tunnelling methods and their MPLS implementation drawbacks are also discussed. The described algorithm sets up the NHLFE (next hop label forwarding entry) tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the described algorithm achieves a large reduction factor in the label space. The work presented applies to both types of connections: P2MP (point-to-multipoint) and P2P (point-to-point).
Abstract:
The aim of traffic engineering is to optimise network resource utilization. Although several works on minimizing network resource utilization have been published, few have focused on the LSR label space. This paper proposes an algorithm that uses MPLS label stack features in order to reduce the number of labels used in LSP forwarding. Some tunnelling methods and their MPLS implementation drawbacks are also discussed. The algorithm described sets up the NHLFE tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the algorithm achieves a large reduction factor in the label space. The work presented here applies to both types of connections: P2MP and P2P.
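The exact label-merging rules of the proposed algorithm are not given in the abstract; the sketch below only illustrates the general label-stack tunnelling idea it builds on. LSPs whose routes share a contiguous sub-path are carried inside one outer tunnel label along that segment, so interior LSRs need a single NHLFE entry instead of one per LSP. The routes and the chosen segment are assumptions for the example.

```python
# Illustrative sketch of label-stack tunnelling (not the paper's exact algorithm):
# LSPs sharing the contiguous sub-path B -> C -> D are nested inside one outer
# tunnel label, so the interior LSR "C" keeps one NHLFE entry for the tunnel
# instead of one entry per LSP.

lsps = {                                     # hypothetical LSP routes (LSR ids)
    "lsp1": ["A", "B", "C", "D", "E"],
    "lsp2": ["F", "B", "C", "D", "G"],
    "lsp3": ["H", "B", "C", "D", "E"],
}
segment = ["B", "C", "D"]                    # assumed shared sub-path

def contains_segment(route, seg):
    """True if seg appears as a contiguous sub-sequence of route."""
    n, m = len(route), len(seg)
    return any(route[i:i + m] == seg for i in range(n - m + 1))

tunnelled = [name for name, route in lsps.items() if contains_segment(route, segment)]

labels_plain = len(lsps)                               # one NHLFE entry per LSP at "C"
labels_tunnel = 1 + (len(lsps) - len(tunnelled))       # one tunnel entry + non-tunnelled LSPs

print("tunnelled LSPs:", tunnelled)
print("NHLFE entries at C without tunnelling:", labels_plain)
print("NHLFE entries at C with tunnelling:   ", labels_tunnel)
```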
Abstract:
The immobile location-allocation (LA) problem is a type of LA problem that consists in determining the service each facility should offer in order to optimize some criterion (such as the global demand covered), given the positions of the facilities and the customers. Due to the complexity of the problem, namely that it is a combinatorial problem whose size grows exponentially with the number of possible services and the number of facilities, and that its search space is non-convex with several local optima, traditional methods cannot be applied directly. We therefore propose the use of clustering analysis to convert the initial problem into several smaller sub-problems. In this way, we present and analyze the suitability of some clustering methods for partitioning the LA problem described above. We then explore the use of metaheuristic techniques such as genetic algorithms, simulated annealing and cuckoo search to solve the sub-problems after the clustering analysis.
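As a hedged illustration of the cluster-then-optimize decomposition (the concrete clustering method, demand model and metaheuristic parameters are not fixed by the abstract and are invented here), the sketch below partitions the facilities with k-means and runs a small simulated-annealing loop inside each cluster:

```python
# Sketch of the decomposition idea: k-means partitioning of facilities, then
# simulated annealing per cluster on a toy demand objective (all assumptions).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_fac, n_cust, n_serv = 12, 200, 4
facilities = rng.uniform(0, 10, (n_fac, 2))           # fixed facility positions
customers = rng.uniform(0, 10, (n_cust, 2))           # fixed customer positions
cust_service = rng.integers(0, n_serv, n_cust)        # service each customer demands

def demand(assign, fac_idx):
    """Toy objective: customers within radius 3 of a facility (in this cluster)
    offering the service they demand count as served."""
    served = 0
    for c, pos in enumerate(customers):
        d = np.linalg.norm(facilities[fac_idx] - pos, axis=1)
        near = fac_idx[d < 3.0]
        if any(assign[f] == cust_service[c] for f in near):
            served += 1
    return served

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(facilities)

assignment = np.zeros(n_fac, dtype=int)
for k in range(3):                                     # solve each sub-problem
    idx = np.where(clusters == k)[0]
    current = rng.integers(0, n_serv, len(idx))
    cur_val = demand(dict(zip(idx, current)), idx)
    best, best_val = current.copy(), cur_val
    T = 1.0
    for _ in range(200):                               # simulated annealing loop
        cand = current.copy()
        cand[rng.integers(len(idx))] = rng.integers(n_serv)
        val = demand(dict(zip(idx, cand)), idx)
        # accept improvements always, worse moves with Boltzmann probability
        if val >= cur_val or rng.random() < np.exp((val - cur_val) / T):
            current, cur_val = cand, val
            if cur_val > best_val:
                best, best_val = current.copy(), cur_val
        T *= 0.98                                      # cooling schedule
    assignment[idx] = best

print("services assigned to facilities:", assignment)
```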
Abstract:
The space subdivision in cells resulting from a process of random nucleation and growth is a subject of interest in many scientific fields. In this paper, we deduce the expected value and variance of the cell-size distribution, assuming that the space subdivision process is in accordance with the premises of the Kolmogorov-Johnson-Mehl-Avrami model. We have not imposed restrictions on the time dependence of the nucleation and growth rates. We have also developed an approximate analytical cell-size probability density function. Finally, we have applied our approach to the distributions resulting from solid-phase crystallization under isochronal heating conditions.
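For reference only (the abstract does not reproduce it), the KJMA premises relate the transformed fraction to the extended volume; a standard form of this relation in three dimensions, with time-dependent nucleation rate I(t) and growth rate G(t), is

```latex
X(t) \;=\; 1 - \exp\!\bigl[-X_{\mathrm{ext}}(t)\bigr],
\qquad
X_{\mathrm{ext}}(t) \;=\; \frac{4\pi}{3}\int_{0}^{t} I(t')
\left[\int_{t'}^{t} G(t'')\,dt''\right]^{3} dt' ,
```

and the cell-size statistics studied above follow from the same premises (random nucleation, isotropic growth, cells defined by impingement).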
Abstract:
This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.
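The stationary patterns described above can be reproduced with a very small dynamic-programming toy model (a hypothetical two-product example with a single state-dependence parameter, not the paper's estimated specification): a positive parameter yields an inertial policy, a negative one a variety-seeking cycle.

```python
# Toy value iteration for a state-dependent choice model (illustrative only).
# State = product bought last period; utility gains `gamma` when the same
# product is repeated (gamma > 0 -> inertia, gamma < 0 -> variety seeking).
import numpy as np

def stationary_policy(gamma, base=(1.0, 0.95), beta=0.9, iters=500):
    n = len(base)
    V = np.zeros(n)
    for _ in range(iters):                        # value iteration
        Q = np.array([[base[j] + (gamma if j == i else 0.0) + beta * V[j]
                       for j in range(n)] for i in range(n)])
        V = Q.max(axis=1)
    return Q.argmax(axis=1)                       # optimal choice given last purchase

print("policy with inertia (gamma=+0.3):", stationary_policy(+0.3))   # repeat-buys one product
print("policy with variety seeking (gamma=-0.3):", stationary_policy(-0.3))  # alternates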
Abstract:
Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (both existing and newly proposed) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data of the affected areas to corresponding disaster management actors by means of a geographic information system (GIS) interface. This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
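The segmentation and hyperparameter-assignment step can be pictured with a much simpler stand-in. The sketch below uses k-means on local image statistics instead of the paper's wavelet plus self-organizing-map scheme, and only shows how a per-pixel hyperparameter map could be assembled from a segmented MLE reconstruction; the file name and region values are placeholders.

```python
# Simplified stand-in for the segmentation step (k-means on local statistics
# instead of the wavelet + self-organizing-map scheme), shown only to
# illustrate building a space-variant hyperparameter map.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans

mle = np.load("mle_reconstruction.npy")        # hypothetical MLE reconstruction (2-D)
local_mean = uniform_filter(mle, size=9)
local_var = uniform_filter(mle**2, size=9) - local_mean**2

features = np.column_stack([local_mean.ravel(),
                            np.sqrt(np.clip(local_var, 0, None)).ravel()])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
regions = labels.reshape(mle.shape)

# One hyperparameter per region (placeholder values; in practice they would
# come from feasibility tests or cross-validation, as described above).
alpha_per_region = {0: 0.5, 1: 1.0, 2: 2.0, 3: 5.0}
alpha_map = np.vectorize(alpha_per_region.get)(regions)  # space-variant hyperparameter
```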
Abstract:
The lack, in our country, of quantitative indicators with which to carry out a short-term analysis of regional industrial activity has opened a debate centred on which methodology is most appropriate for constructing indicators of this kind. Within this framework, this paper presents the main conclusions obtained in previous studies (Clar et al., 1997a, 1997b and 1998) on the suitability of extending the methodologies currently applied to the Spanish regions for building industrial activity indicators by indirect methods. These conclusions lead us to propose a strategy different from those currently in use. Specifically, following Israilevich and Kuttner (1993), we propose a latent-variable model to estimate the indicator of regional industrial production. This type of model can be specified as a state-space model and estimated by means of the Kalman filter. To validate the proposed methodology, indicators are estimated following this approach for three of the four Spanish regions that have an Industrial Production Index (IPI) built by the direct method (Andalucía, Asturias and the País Vasco) and compared with the published (official) IPIs. The results show the good performance of the proposed strategy, thus opening a line of work with which to remedy the deficit referred to above.
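The state-space formulation mentioned above can be illustrated with a minimal single-factor example. The specification, series and parameter values below are invented for the sketch (the paper's actual model follows Israilevich and Kuttner, 1993): a latent industrial-activity index drives several observed indicators and is extracted with a Kalman filter.

```python
# Minimal single-factor state-space sketch (illustrative, not the paper's model):
# latent index x_t = phi * x_{t-1} + w_t drives indicators y_t = Z x_t + v_t,
# and the Kalman filter recovers x_t from the observed indicators.
import numpy as np

rng = np.random.default_rng(3)
T, n_obs = 120, 3
phi, Z = 0.8, np.array([1.0, 0.7, 1.2])        # assumed dynamics and loadings
Q, R = 1.0, 0.5 * np.eye(n_obs)                # state and measurement noise

x = np.zeros(T)
y = np.zeros((T, n_obs))
for t in range(1, T):                          # simulate hypothetical indicators
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(Q))
    y[t] = Z * x[t] + rng.multivariate_normal(np.zeros(n_obs), R)

x_hat, P = 0.0, 1.0
filtered = np.zeros(T)
for t in range(T):                             # standard Kalman filter recursion
    x_pred, P_pred = phi * x_hat, phi**2 * P + Q
    S = np.outer(Z, Z) * P_pred + R            # innovation covariance
    K = P_pred * Z @ np.linalg.inv(S)          # Kalman gain (1 x n_obs)
    x_hat = x_pred + K @ (y[t] - Z * x_pred)
    P = P_pred - K @ Z * P_pred
    filtered[t] = x_hat

print("correlation of filtered index with the latent index:",
      np.corrcoef(filtered, x)[0, 1].round(3))
```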
Abstract:
The Gross-Neveu model in an S^1 space is analyzed by means of a variational technique: the Gaussian effective potential. By making the proper connection with previous exact results at finite temperature, we show that this technique is able to describe the phase transition occurring in this model. We also make some remarks about the appropriate treatment of Grassmann variables in variational approaches.
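As a reminder (not reproduced in the abstract), the model is defined by the Lagrangian with N fermion flavours and a discrete chiral symmetry,

```latex
\mathcal{L} \;=\; \bar{\psi}_a\, i\gamma^{\mu}\partial_{\mu}\psi_a
\;+\; \frac{g^{2}}{2}\,\bigl(\bar{\psi}_a\psi_a\bigr)^{2},
\qquad a = 1,\dots,N ,
```

and compactifying one coordinate on S^1 of circumference β with antiperiodic boundary conditions for the fermions makes the S^1 theory equivalent to the finite-temperature theory at T = 1/β, which is the connection exploited in the comparison above.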
Abstract:
We explore the phase diagram of a two-component ultracold atomic Fermi gas interacting with zero-range forces in the limit of weak coupling. We focus on the dependence of the pairing gap and the free energy on the variations in the number densities of the two species while the total density of the system is held fixed. As the density asymmetry is increased, the system exhibits a transition from a homogeneous Bardeen-Cooper-Schrieffer (BCS) phase to phases with spontaneously broken global space symmetries. One such realization is the deformed Fermi surface superfluidity (DFS), which exploits the possibility of deforming the Fermi surfaces of the species into ellipsoidal form at zero total momentum of Cooper pairs. The critical asymmetries at which the transition from DFS to the unpaired state occurs are larger than those for the BCS phase. In this precritical region the DFS phase lowers the pairing energy of the asymmetric BCS state. We compare quantitatively the DFS phase to another realization of superconducting phases with broken translational symmetry: the single-plane-wave Larkin-Ovchinnikov-Fulde-Ferrell phase, which is characterized by a nonvanishing center-of-mass momentum of the Cooper pairs. The possibility of the detection of the DFS phase in time-of-flight experiments is discussed and quantified for the case of 6Li atoms trapped in two different hyperfine states.
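For orientation, the density asymmetry enters the weak-coupling description through a chemical-potential mismatch; in standard notation (not reproduced in the abstract) the homogeneous BCS quasiparticle branches are

```latex
E_{k}^{\pm} \;=\; \sqrt{\xi_{k}^{2} + \Delta^{2}} \;\pm\; \delta\mu ,
\qquad
\xi_{k} = \frac{\hbar^{2}k^{2}}{2m} - \bar{\mu},
\quad
\bar{\mu} = \tfrac{1}{2}\bigl(\mu_{\uparrow}+\mu_{\downarrow}\bigr),
\quad
\delta\mu = \tfrac{1}{2}\bigl(\mu_{\uparrow}-\mu_{\downarrow}\bigr),
```

and the LOFF and DFS phases lower the free energy relative to this branch by letting, respectively, the pair momentum or the shape of the two Fermi surfaces absorb part of the mismatch δμ.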
Abstract:
We have systematically analyzed six different lattice models with quenched disorder and no thermal fluctuations exhibiting a field-driven first-order phase transition. We have studied the nonequilibrium transition, appearing when varying the amount of disorder, characterized by the change from a discontinuous hysteresis cycle (with one or more large avalanches) to a smooth one (with only tiny avalanches). We have computed critical exponents using finite-size scaling techniques and shown that they are consistent with universal values depending only on the space dimensionality d.
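A hedged sketch of the finite-size scaling step (the observable, exponent values and data below are placeholders, not those of the paper): curves measured at several linear sizes L should collapse onto a single curve when rescaled with the assumed exponents.

```python
# Generic finite-size scaling collapse sketch (placeholder data and exponents).
# Curves m(r, L) collapse when plotted as m * L**(beta/nu) vs (r - r_c) * L**(1/nu).
import numpy as np
import matplotlib.pyplot as plt

beta_over_nu, one_over_nu, r_c = 0.1, 0.7, 2.16   # assumed values for the sketch

def fake_order_parameter(r, L):
    """Placeholder scaling form used only to generate example curves."""
    x = (r - r_c) * L**one_over_nu
    return L**(-beta_over_nu) * 0.5 * (1 - np.tanh(x))

for L in (16, 32, 64):
    r = np.linspace(1.8, 2.5, 200)
    m = fake_order_parameter(r, L)
    plt.plot((r - r_c) * L**one_over_nu, m * L**beta_over_nu, label=f"L={L}")

plt.xlabel(r"$(r-r_c)\,L^{1/\nu}$")
plt.ylabel(r"$m\,L^{\beta/\nu}$")
plt.legend()
plt.show()
```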
Abstract:
The use of different kinds of nonlinear filtering in a joint transform correlator is studied and compared. The study is divided into two parts, one corresponding to the object space and the second to the Fourier domain of the joint power spectrum. In the first part, phase and inverse filters are computed; their inverse Fourier transforms are also computed, thereby becoming the reference in the object space. In the Fourier space, the binarization of the power spectrum is realized and compared with a new procedure for removing the spatial envelope. All cases are simulated and experimentally implemented by a compact joint transform correlator.
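A minimal numerical sketch of the Fourier-plane operations discussed above (the input images and the binarization threshold are placeholders): the scene and the reference are placed side by side, the joint power spectrum is computed and optionally binarized, and a second Fourier transform yields the correlation plane.

```python
# Minimal joint transform correlator sketch (placeholder inputs and threshold).
import numpy as np

scene = np.load("scene.npy")          # hypothetical input scene (2-D array)
ref = np.load("reference.npy")        # hypothetical reference object (same shape)

# Place scene and reference side by side in the input plane.
joint_input = np.concatenate([scene, ref], axis=1)

# Joint power spectrum (first Fourier transform + intensity detection).
jps = np.abs(np.fft.fft2(joint_input)) ** 2

# Classical JTC uses the JPS directly; a binarized (nonlinear) JTC thresholds it.
binary_jps = (jps > np.median(jps)).astype(float)

# Second Fourier transform gives the correlation plane; the cross-correlation
# terms appear displaced by the scene-reference separation.
correlation = np.fft.fftshift(np.abs(np.fft.ifft2(binary_jps)))
```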
Abstract:
Nucleation rates for tunneling processes in Minkowski and de Sitter space are investigated, taking into account one loop prefactors. In particular, we consider the creation of membranes by an antisymmetric tensor field, analogous to Schwinger pair production. This can be viewed as a model for the decay of a false (or true) vacuum at zero temperature in the thin wall limit. Also considered is the spontaneous nucleation of strings, domain walls, and monopoles during inflation. The instantons for these processes are spherical world sheets or world lines embedded in flat or de Sitter backgrounds. We find the contribution of such instantons to the semiclassical partition function, including the one loop corrections due to small fluctuations around the spherical world sheet. We suggest a prescription for obtaining, from the partition function, the distribution of objects nucleated during inflation. This can be seen as an extension of the usual formula, valid in flat space, according to which the nucleation rate is twice the imaginary part of the free energy. For the case of pair production, the results reproduce those that can be obtained using second quantization methods, confirming the validity of instanton techniques in de Sitter space. Throughout the paper, both the gravitational field and the antisymmetric tensor field are assumed external.
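The "usual formula" referred to in the closing sentences is the standard flat-space relation between the nucleation rate and the imaginary part of the free energy, together with the semiclassical form of the rate including its one-loop prefactor (written here only as a reminder):

```latex
\Gamma \;\simeq\; 2\,\mathrm{Im}\,F ,
\qquad
\Gamma \;=\; A\, e^{-S_{E}}\,\bigl[\,1 + \mathcal{O}(\hbar)\,\bigr],
```

where S_E is the Euclidean action of the spherical instanton and the prefactor A collects the determinant of small fluctuations around it; the prescription proposed in the paper extends the first relation to objects nucleated during inflation.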