1000 results for Mètodes de simulació
Abstract:
By means of computer simulations and solutions of the equations of the mode coupling theory (MCT), we investigate the role of intramolecular barriers on several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need to improve the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength. These include anomalous scaling of relaxation times, long-time plateaux, and a nonmonotonic wavelength dependence of the mode correlators.
Abstract:
Background In an agreement assay, it is of interest to evaluate the degree of agreement between the different methods (devices, instruments or observers) used to measure the same characteristic. We propose in this study a technical simplification for inference about the total deviation index (TDI) estimate to assess agreement between two devices when measurements are normally distributed, and describe its utility to evaluate inter- and intra-rater agreement if more than one reading per subject is available for each device. Methods We propose to estimate the TDI by constructing a probability interval of the difference in paired measurements between devices, and thereafter we derive a tolerance interval (TI) procedure as a natural way to make inferences about probability limit estimates. We also describe how the proposed method can be used to compute bounds of the coverage probability. Results The approach is illustrated in a real case example where the agreement between two instruments, a handheld mercury sphygmomanometer and an OMRON 711 automatic device, is assessed in a sample of 384 subjects whose systolic blood pressure was measured twice with each device. A simulation study is implemented to evaluate the accuracy of the approach and compare it with two established methods, showing that the TI approximation produces accurate empirical confidence levels which are reasonably close to the nominal confidence level. Conclusions The method proposed is straightforward since the TDI estimate is derived directly from a probability interval of a normally-distributed variable in its original scale, without further transformations. Thereafter, a natural way of making inferences about this estimate is to derive the appropriate TI. Constructions of TI based on normal populations are implemented in most standard statistical packages, thus making it simpler for any practitioner to implement our proposal to assess agreement.
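The TDI construction described in this abstract can be sketched numerically: under the normality assumption, the TDI at probability p is the p-quantile of |D| for D ~ N(μ, σ), found by solving the coverage equation for the symmetric interval [−q, q]. A minimal sketch, assuming this reading of the method (the function name and the example data are our own, not from the paper):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def tdi_estimate(d, p=0.90):
    """TDI point estimate: the p-quantile of |D| with D ~ N(mu, sigma),
    where mu and sigma are estimated from the paired differences d."""
    mu, sigma = np.mean(d), np.std(d, ddof=1)
    # coverage of the symmetric interval [-q, q] minus the target p
    f = lambda q: norm.cdf((q - mu) / sigma) - norm.cdf((-q - mu) / sigma) - p
    # f(0) = -p < 0 and f is ~ 1 - p > 0 far out, so the root is bracketed
    return brentq(f, 0.0, abs(mu) + 10 * sigma)

# hypothetical paired systolic-pressure differences between two devices (mmHg)
rng = np.random.default_rng(0)
d = rng.normal(1.0, 4.0, size=384)
tdi = tdi_estimate(d, p=0.90)  # 90% of differences lie within +/- tdi
```

Inference would then proceed by replacing this point estimate with the corresponding normal tolerance limit, as the abstract proposes.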
Abstract:
We present computer simulations of a simple bead-spring model for polymer melts with intramolecular barriers. By systematically tuning the strength of the barriers, we investigate their role on the glass transition. Dynamic observables are analyzed within the framework of the mode coupling theory (MCT). Critical nonergodicity parameters, critical temperatures, and dynamic exponents are obtained from consistent fits of simulation data to MCT asymptotic laws. The so-obtained MCT λ-exponent increases from standard values for fully flexible chains to values close to the upper limit for stiff chains. In analogy with systems exhibiting higher-order MCT transitions, we suggest that the observed large λ-values arise from the interplay between two distinct mechanisms for dynamic arrest: general packing effects and polymer-specific intramolecular barriers. We compare simulation results with numerical solutions of the MCT equations for polymer systems, within the polymer reference interaction site model (PRISM) for static correlations. We verify that the approximations introduced by the PRISM are fulfilled by simulations, with the same quality over the whole range of investigated barrier strengths. The numerical solutions reproduce the qualitative trends of simulations for the dependence of the nonergodicity parameters and critical temperatures on the barrier strength. In particular, the increase in the barrier strength at fixed density increases the localization length and the critical temperature. However, the qualitative agreement between theory and simulation breaks down in the limit of stiff chains. We discuss the possible origin of this feature.
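The λ-exponent mentioned in this abstract is tied to the critical exponents a and b of the MCT asymptotic laws through the standard transcendental relation λ = Γ(1−a)²/Γ(1−2a) = Γ(1+b)²/Γ(1+2b). A small sketch of this relation (the numerical values below are the textbook hard-sphere ones, not results of this paper):

```python
from math import gamma

def mct_lambda_from_a(a):
    """MCT exponent parameter from the critical-decay exponent a (0 < a < 1/2)."""
    return gamma(1 - a) ** 2 / gamma(1 - 2 * a)

def mct_lambda_from_b(b):
    """Same exponent parameter from the von Schweidler exponent b."""
    return gamma(1 + b) ** 2 / gamma(1 + 2 * b)

# textbook hard-sphere values: a ~ 0.312 and b ~ 0.583 both give lambda ~ 0.735
lam_a = mct_lambda_from_a(0.312)
lam_b = mct_lambda_from_b(0.583)
```

Fitting a and b independently and checking that both routes give the same λ is the usual consistency test for fits of simulation data to the asymptotic laws.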
Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions are rigorously studied in great detail on a straightforward basis using a considerable variety of Quantum Dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections of a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that both strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches offer much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort required to account for the Coriolis couplings, are analyzed in this paper.
Abstract:
Once the data have been entered into the SPSS statistical package (Statistical Package for the Social Sciences) as a data matrix, it is time to consider optimizing that matrix in order to get the most out of the data, depending on the type of analysis to be performed. For this purpose, SPSS itself provides a series of utilities that can be very useful. These basic utilities can be grouped by functionality into: utilities for editing data, utilities for modifying variables, and the help options the package offers. Some of these utilities are presented below.
Abstract:
Discriminant analysis is a statistical method used to determine which variables, measured on objects or individuals, best explain the assignment of those objects or individuals to the groups to which they belong. It is a technique that allows us to check to what extent the independent variables considered in a study correctly classify the subjects or objects. We present and explain the main elements of the procedure for carrying out discriminant analysis and its application using the SPSS statistical package, version 18: the development of the statistical model, the conditions for applying the analysis, the estimation and interpretation of the discriminant functions, the classification methods, and the validation of the results.
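The estimation and classification steps described in this abstract can be illustrated outside SPSS with a two-group Fisher discriminant computed directly; this is a minimal sketch on synthetic data, not the SPSS workflow the abstract describes:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-group Fisher discriminant: w maximizes between- over within-group
    variance; c is the midpoint cutoff between the projected group means."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-group scatter matrix
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + \
         np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)
    c = w @ (m0 + m1) / 2
    return w, c

# synthetic groups separated along the first variable
rng = np.random.default_rng(0)
X0 = rng.normal(0, 1, (50, 2))
X1 = rng.normal([4, 0], 1, (50, 2))
w, c = fisher_lda(X0, X1)
# a case x is assigned to group 1 when x @ w > c
```

Validation of the results would then amount to checking the proportion of correctly classified cases, ideally on held-out data.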
Abstract:
A statistical indentation method has been employed to study the hardness of fire-refined high conductivity copper, using the nanoindentation technique. The Joslin and Oliver approach was used with the aim of separating the hardness (H) contribution of the copper matrix from that of inclusions and grain boundaries. This approach relies on a large array of imprints (around 400 indentations), performed at an indentation depth of 150 nm. A statistical study using a cumulative distribution function fit and Gaussian simulated distributions shows that H for each phase can be extracted when the indentation depth is much lower than the size of the secondary phases. It is found that the thermal treatment produces a hardness increase, due to the partial re-dissolution of the inclusions (mainly Pb and Sn) in the matrix.
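The separation of per-phase hardness from a large indentation array can be sketched with a simple two-component Gaussian mixture fitted by EM (a stand-in for the CDF-based fit the abstract uses); the hardness values below are synthetic, not data from the paper:

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """Basic EM fit of a two-component 1-D Gaussian mixture, splitting an
    indentation-hardness sample into two phase populations."""
    mu = np.percentile(x, [10, 90]).astype(float)   # spread-out initial means
    sig = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each indent
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
        r = pi * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means and widths
        n = r.sum(axis=0)
        pi = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return pi, mu, sig

rng = np.random.default_rng(1)
# hypothetical sample: soft matrix ~1.2 GPa, harder secondary phase ~2.5 GPa
x = np.concatenate([rng.normal(1.2, 0.1, 300), rng.normal(2.5, 0.2, 100)])
pi, mu, sig = em_two_gaussians(x)  # mu recovers the two phase hardnesses
```

As in the paper, the separation only works when the two populations are genuinely distinct, i.e. when indents are shallow enough to probe one phase at a time.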
Abstract:
Background Nowadays, combining the different sources of information to improve the available biological knowledge is a challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly, the right kernel is chosen for each data set; secondly, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge.
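The two-step integration described in this abstract can be sketched in a few lines of numpy: each source gets its own kernel over the same samples, the kernels are combined (here by a plain sum, one common choice), and kernel PCA is the eigendecomposition of the double-centered combined matrix. The kernel choice and data below are placeholders, not the paper's:

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix over the rows of X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pca(K, n_components=2):
    """Sample scores from the top principal axes of a (combined) kernel."""
    n = len(K)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                          # double-center the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)         # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

rng = np.random.default_rng(0)
X1 = rng.normal(size=(40, 5))   # first data source
X2 = rng.normal(size=(40, 8))   # second source, same 40 samples
K = rbf_kernel(X1, 0.1) + rbf_kernel(X2, 0.1)   # sum-combined kernel
scores = kernel_pca(K)          # low-dimensional sample representation
```

Weighted kernel sums work the same way; the weights then control how much each data source contributes to the shared representation.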
Abstract:
We present molecular dynamics (MD) simulation results for dense fluids of ultrasoft, fully penetrable particles. These are a binary mixture and a polydisperse system of particles interacting via the generalized exponential model, which is known to yield cluster crystal phases for the corresponding monodisperse systems. Because of the dispersity in the particle size, the systems investigated in this work do not crystallize and instead form disordered cluster phases. The clustering transition appears as a smooth crossover to a regime in which particles are mostly located in clusters, isolated particles being infrequent. The analysis of the internal cluster structure reveals microsegregation of the big and small particles, with a strong homo-coordination in the binary mixture. Upon further lowering the temperature below the clustering transition, the motion of the clusters' centers-of-mass slows down dramatically, giving way to a cluster glass transition. In the cluster glass, the diffusivities remain finite and display an activated temperature dependence, indicating that relaxation in the cluster glass occurs via particle hopping in a nearly arrested matrix of clusters. Finally, we discuss the influence of the microscopic dynamics on the transport properties by comparing the MD results with Monte Carlo simulations.
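The generalized exponential model referred to above is the bounded pair potential v(r) = ε exp[−(r/σ)^n]; for n > 2 its Fourier transform takes negative values, which is the standard criterion for cluster formation in ultrasoft systems. A quick numerical check of both facts (the grid and integration cutoff are arbitrary choices of ours):

```python
import numpy as np
from scipy.integrate import quad

def gem_potential(r, eps=1.0, sigma=1.0, n=4):
    """GEM-n pair potential; bounded at r = 0, so particles can fully overlap."""
    return eps * np.exp(-(r / sigma) ** n)

def vhat(k, n):
    """3-D Fourier transform of the radial potential, computed numerically.
    np.sinc(k*r/pi) equals sin(k*r)/(k*r)."""
    integrand = lambda r: 4 * np.pi * r**2 * np.sinc(k * r / np.pi) * gem_potential(r, n=n)
    return quad(integrand, 0.0, 6.0, limit=200)[0]

ks = np.linspace(0.5, 15.0, 60)
min_gem4 = min(vhat(k, 4) for k in ks)   # negative: GEM-4 supports clustering
min_gem2 = min(vhat(k, 2) for k in ks)   # Gaussian core: purely positive
```

The boundedness at overlap, v(0) = ε, is what allows many particles to sit on top of each other and form the clusters discussed in the abstract.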
Abstract:
In this work we studied the behaviour of antimalarial compounds, such as drugs and polymers, in different situations. One of the barriers identified as a main obstacle to improving the efficacy of antimalarial compounds is the limit on the amount of drug that can be encapsulated inside a liposome, which depends on its solubility in aqueous media. We drew inspiration from the description of a new type of nanocapsule with oncological applications capable of encapsulating large amounts of drugs (protocells; Ashley et al., 2011). These constructs, formed by liposomes with a highly porous silicon oxide core that holds the drug, are called "protocells"; compared with liposomes, they have greater selectivity and stability and can deliver high drug concentrations directly into the cytosol of cancer cells. This study is based on the fabrication of these new nanovectors loaded with antimalarial drugs, with the future aim of targeting them to malaria-infected erythrocytes (pRBCs). Another part of the work is the study of the distribution of the polymer ISA-FITC in Anopheles atroparvus. Since polymers have been used as antimalarial carriers, we considered the option of eliminating the parasite inside the mosquito itself, as an alternative to the studies carried out so far, which focus on the host-infection stages. For this reason the experiment was designed around this polymer, with the final aim of determining its localization in a parasite-free Anopheles mosquito. OBJECTIVES: To determine the encapsulation capacity of three types of nanoparticles, made of the same material but with different size and charge characteristics, by incubating them with five antimalarial drugs: methylene blue, primaquine, chloroquine, quinine and curcumin, each with different pH, solubility and structural characteristics. Some of these drugs have not been used in other studies because of their toxicity or high nonspecificity (which we aim to reduce once they are encapsulated in protocells). To construct "protocells" once the best encapsulating nanoparticle and candidate drug have been determined, and to measure the drug concentration they can hold and its release rate in PBS (simulating the physiological conditions of pRBCs). To study the localization of the antimalarial polymer ISA-FITC in the anatomy of the mosquito Anopheles atroparvus. PROCEDURES: spectrophotometric methods, cryo-electron transmission microscopy, confocal fluorescence microscopy.
Abstract:
Nowadays it is difficult to talk about statistical processes for quantitative data analysis without referring to computing applied to research. These computing resources are often based on software packages designed to help the researcher in the data-analysis phase. At present, one of the most refined and complete packages is SPSS (Statistical Package for the Social Sciences). SPSS is a suite of programs for carrying out statistical analysis of data. It is a very powerful statistical application, of which several versions have been developed since its beginnings in the 1970s. In this manual, the computer outputs shown belong to version 11.0.1. However, although the appearance has changed over time, the way the package works remains very similar across versions. Before starting to use the SPSS applications, it is important to become familiar with some of the windows we will use most. On entering SPSS, the first thing we find is the Data Editor. This window basically displays the data we enter. The Data Editor includes two views, Data View and Variable View, which can be selected from the two tabs at the bottom. Data View contains the general menu and the data matrix, which is structured with the cases in rows and the variables in columns.
Abstract:
We investigate under which dynamical conditions the Julia set of a quadratic rational map is a Sierpiński curve.
Abstract:
The practicum is a privileged environment for the transfer of competences. From this perspective, the project «Analysis and evaluation of the transferability of professional competences in Social Education in practicum centres» (2008MQD155) was carried out at the Universitat de Barcelona during the 2008-2010 academic years. The complexity of the object of study called for qualitative methods, and an action-research design based on spaces for reflection was chosen. From the participants' perceptions, a set of elements that favour the transfer of competences within the practicum was identified; these notably include the need for joint planning, and for specific planning in each training setting, structured around reflection-action on the student's practice. To make this possible, the practicum centres and the universities must recognize themselves as co-responsible for practical training.
Abstract:
This symposium presents research from different contexts to improve our collective understanding of a variety of aspects of mixed forms of service delivery, be they mixed contracting at the level of the market (which is more common in the U.S.), or mixed management and ownership at the level of the firm (which is more common in Europe). The articles included in this special symposium examine the factors that give rise to mixed forms of service delivery (e.g., economic and fiscal stress, regulatory flexibility, geography, management) and how these factors impact their design and operation. Articles also explore the performance of mixed forms of service delivery relative to more conventional arrangements like contracted or direct service delivery. The articles contribute to a better theoretical and conceptual understanding of mixed/hybrid forms of service delivery.
Abstract:
From 6 to 8 November 1982 one of the most catastrophic flash-flood events on record struck the Eastern Pyrenees, affecting Andorra as well as France and Spain, with rainfall accumulations exceeding 400 mm in 24 h, 44 fatalities and widespread damage. This paper aims to exhaustively document this heavy precipitation event and examines mesoscale simulations performed with the French Meso-NH non-hydrostatic atmospheric model. Large-scale simulations show the slow-evolving synoptic environment favourable for the development of a deep Atlantic cyclone, which induced a strong southerly flow over the Eastern Pyrenees. From the evolution of the synoptic pattern, four distinct phases have been identified during the event. The mesoscale analysis presents the second and third phases as the most intense in terms of rainfall accumulations and highlights the interaction of the moist and conditionally unstable flows with the mountains. The presence of a SW low-level jet (30 m s-1) around 1500 m also played a crucial role in focusing the precipitation over the exposed south slopes of the Eastern Pyrenees. Backward trajectories based on Eulerian on-line passive tracers indicate that orographic uplift was the main forcing mechanism that triggered and maintained the precipitating systems for more than 30 h over the Pyrenees. The moisture of the feeding flow mainly came from the Atlantic Ocean (7-9 g kg-1), and the role of the Mediterranean as a local moisture source was very limited (2-3 g kg-1) due to the high initial water vapour content of the parcels and the rapid passage over the basin along the Spanish Mediterranean coast (less than 12 h).