912 results for Dwarf Galaxy Fornax Distribution Function Action Based
Abstract:
A modification of the Dubinin-Radushkevich pore filling model, incorporating the repulsive contribution to the pore potential and bulk non-ideality, is proposed in this paper for the characterization of activated carbon using liquid-phase adsorption. For this purpose, experiments were performed using ethyl propionate, ethyl butyrate, and ethyl isovalerate as adsorbates and the microporous-mesoporous activated carbons Filtrasorb 400, Norit ROW 0.8 and Norit ROX 0.8 as adsorbents. The repulsive contribution to the pore potential is incorporated through a Lennard-Jones intermolecular potential model, and the bulk liquid-phase non-ideality through the UNIFAC activity coefficient model. For the characterization of the activated carbons, the generalized adsorption isotherm is used with a bimodal gamma function as the pore size distribution. The model is found to represent the experimental data very well, and significantly better than when the classical energy-size relationship is used or when bulk non-ideality is neglected. Excellent agreement between the bimodal gamma pore size distribution and the DFT-cum-regularization-based pore size distribution is also observed, supporting the validity of the proposed model.
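To make the generalized-isotherm machinery concrete, the following minimal Python sketch integrates a Dubinin-Radushkevich-type local isotherm over a bimodal gamma pore size distribution. All function names and parameter values are hypothetical, and for brevity it uses the classical inverse energy-size relationship E ∝ 1/r rather than the paper's Lennard-Jones-corrected potential or UNIFAC activities.

```python
# Sketch: generalized adsorption isotherm with a bimodal gamma pore size
# distribution (PSD). All parameter values are illustrative, not fitted.
import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

def psd(r, w=0.6, a1=4.0, b1=0.15, a2=8.0, b2=0.5):
    """Bimodal gamma PSD: weighted sum of two gamma densities over pore half-width r (nm)."""
    return w * gamma.pdf(r, a1, scale=b1) + (1.0 - w) * gamma.pdf(r, a2, scale=b2)

def local_isotherm(activity, r, beta=0.3, E0=20.0, RT=2.48):
    """DR-type local filling for a pore of half-width r; classical E = beta*E0/r
    energy-size relationship (the paper replaces this with an LJ-based potential)."""
    A = -RT * np.log(np.clip(activity, 1e-12, 1.0))  # adsorption potential, kJ/mol
    E = beta * E0 / r                                # characteristic energy for this pore size
    return np.exp(-(A / E) ** 2)

def overall_uptake(activity):
    """Generalized adsorption isotherm: integrate local filling over the PSD."""
    integrand = lambda r: local_isotherm(activity, r) * psd(r)
    val, _ = quad(integrand, 0.05, 5.0)
    return val

for a in (0.01, 0.1, 0.5, 0.9):
    print(f"activity={a:4.2f}  fractional uptake={overall_uptake(a):.3f}")
```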
Abstract:
A bituminous coal was pyrolyzed in a nitrogen stream in an entrained flow reactor at temperatures from 700 to 1475 °C. Char samples were collected at different positions along the reactor. Each collected sample was oxidized non-isothermally in a TGA for reactivity determination. The reactivity of the coal char was found to decrease rapidly with residence time up to 0.5 s, after which it decreased only slightly. On the basis of the reactivity data at various temperatures, a new approach was used to obtain the true activation energy distribution for thermal annealing without assuming any distribution function form or a constant pre-exponential factor. The true activation energy distribution appears to consist of two separate parts corresponding to different temperature ranges, suggesting different mechanisms in different temperature ranges. Partially burnt coal chars were also collected along the reactor when the coal was oxidized in air at temperatures from 700 to 1475 °C. The collected samples were analyzed for residual carbon content, and the specific reaction rate was estimated. The characteristic time of thermal deactivation was compared with that of oxidation under realistic conditions. The characteristic times were found to be close to each other, indicating the importance of thermal deactivation during combustion of the coal studied.
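The annealing behaviour described above can be illustrated with a distributed activation energy sketch: the surviving reactive fraction is an integral of exp(-k(E)t) over an activation energy distribution f(E). Note that the paper recovers f(E) from data without assuming its form or a constant pre-exponential factor; the Gaussian f(E) and the constant A below are purely illustrative.

```python
# Sketch: thermal annealing as a distributed activation energy process.
# Remaining reactivity after annealing time t at temperature T is an
# integral over an (assumed, not extracted) activation energy distribution.
import numpy as np

R_GAS = 8.314        # J/(mol K)
A_PRE = 1.0e13       # 1/s, illustrative pre-exponential factor

def remaining_reactivity(t, T, E_grid, f_E):
    """Fraction of active sites surviving annealing: integral of f(E)*exp(-k(E)*t) dE."""
    k = A_PRE * np.exp(-E_grid / (R_GAS * T))
    surviving = np.exp(-k * t)
    return np.trapz(f_E * surviving, E_grid)

# Illustrative Gaussian f(E) centered at 300 kJ/mol
E = np.linspace(150e3, 450e3, 600)
fE = np.exp(-0.5 * ((E - 300e3) / 30e3) ** 2)
fE /= np.trapz(fE, E)

for t in (0.1, 0.5, 1.0):  # residence times, s, at 1475 C = 1748 K
    print(f"t={t:4.1f} s  surviving fraction={remaining_reactivity(t, 1748.0, E, fE):.3f}")
```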
Abstract:
The design of magnetic cores can be carried out by optimizing different parameters in accordance with the application requirements. Considering the specifications of the fast field cycling nuclear magnetic resonance (FFC-NMR) technique, the magnetic flux density distribution at the sample insertion volume is one of the core parameters that needs to be evaluated. Recently, it has been shown that FFC-NMR magnets can be built on the basis of solenoid coils with ferromagnetic cores. Since this type of apparatus requires magnets with high magnetic flux density uniformity, a new type of magnet using a ferromagnetic core, copper coils, and superconducting blocks was designed with an improved magnetic flux density distribution. In this paper, the design aspects of the magnet are described and discussed with emphasis on the improvement of the magnetic flux density homogeneity (ΔB/B0) in the air gap. The magnetic flux density distribution is analyzed based on 3-D simulations and NMR experimental results.
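As a small illustration of the homogeneity figure of merit used above, the following sketch computes ΔB/B0 from a handful of hypothetical flux density samples over the insertion volume; in practice the values would come from the 3-D simulation or NMR measurements.

```python
# Sketch: evaluating field homogeneity Delta B / B0 from sampled flux
# density values over the sample insertion volume (values are hypothetical).
import numpy as np

B = np.array([0.2012, 0.2015, 0.2011, 0.2014, 0.2013, 0.2016])  # tesla, illustrative

B0 = B.mean()
homogeneity_ppm = (B.max() - B.min()) / B0 * 1e6
print(f"B0 = {B0:.4f} T, Delta B/B0 = {homogeneity_ppm:.0f} ppm")
```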
Abstract:
Background: Functioning has been recognized as one of the most important outcomes for assessing whether people benefit from interventions aimed at improving their mental health. Functioning refers to how well an individual can complete the tasks and demands required of them by themselves, their family, and their community, depending on the setting and the culture they live in (e.g. the tasks of cooking and cleaning for women in some cultures). Functioning is highly dependent on culture; it has therefore been recommended to develop culture-specific measures of functioning. Developing instruments locally avoids the problems of limited local relevance and appropriateness associated with adapting western instruments. Although each instrument created in this way is culturally bound, such instruments are "cross-cultural" in the sense that each refers to the tasks most important to local people. This approach has proved useful for both researchers and aid agencies working in non-western countries. This study describes International Medical Corps' (IMC) work in Lebanon to create and validate a culture- and gender-specific functioning questionnaire to assess improvements in people who received treatment interventions for mental health problems at the primary health care (PHC) level. Method: The measure was developed using a method that is an alternative to the existing approach of adapting western functioning instruments to other cultures and situations; this approach is rapid and feasible and can yield valid and reliable instruments. Functioning was assessed by first asking local people, using free listing, which tasks are important for caring for themselves, their family, and their community, and then using these tasks as the basis for a culturally valid functioning assessment instrument. Community-specific functioning questionnaires based on these tasks were then created and field-tested for validity (content, face, and construct) and for reliability (inter-rater and test-retest). Results: The study resulted in the creation and validation of a culture- and gender-specific functioning questionnaire that effectively measures the ability to perform tasks important to daily life, as part of assessing client-level outcomes where PHC providers were trained in the identification, management, and referral of people with mental health problems in Lebanon. Conclusion: The paper describes a successful pilot for developing culture- and gender-specific functioning questionnaires that evaluate client-level outcomes as part of a more comprehensive system for monitoring and evaluating community-based case management supports and services.
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While the first versions of the central limit theorem go back to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). By now, extensions of the central limit theorem are available for a multitude of settings, including, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established, and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
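For reference, the Berry-Esseen bound mentioned above has the following standard form for i.i.d. summands (C denotes an absolute constant):

```latex
% Berry-Esseen bound for i.i.d. X_1,...,X_n with
% E[X_i] = 0, E[X_i^2] = sigma^2 > 0, rho = E[|X_i|^3] < infinity:
\sup_{x \in \mathbb{R}}
\left| \mathbb{P}\!\left( \frac{X_1 + \cdots + X_n}{\sigma \sqrt{n}} \le x \right) - \Phi(x) \right|
\;\le\; \frac{C \, \rho}{\sigma^{3} \sqrt{n}}
```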
Abstract:
Unlike fragmental rockfall runout assessments, there are only a few robust methods for quantifying rock-mass-failure susceptibilities at regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEM) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper adds to the Slope Angle Distribution of the cliff unit its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
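A minimal sketch of the added ingredient, the normalized cumulative distribution of slope angles inside detected source areas used as a susceptibility weight, could look as follows. The 45° source threshold and function names are illustrative, not the Flow-R implementation.

```python
# Sketch: normalized empirical CDF of slope angles inside detected rockfall
# source areas, used as a relative susceptibility weight per cell.
import numpy as np

def susceptibility_weights(slope_deg, source_threshold=45.0):
    """For each source cell, return the normalized empirical CDF value of its
    slope angle among all source cells (steeper slope -> weight closer to 1)."""
    src = slope_deg[slope_deg >= source_threshold]
    order = np.argsort(src)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(1, len(src) + 1)
    return src, ranks / len(src)

slopes = np.array([30.0, 47.0, 52.0, 61.0, 49.0, 70.0, 44.0])  # DEM slope angles, degrees
cells, weights = susceptibility_weights(slopes)
for s, w in zip(cells, weights):
    print(f"slope {s:4.1f} deg -> susceptibility weight {w:.2f}")
```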
Abstract:
Objectives: Gentamicin is among the most commonly prescribed antibiotics in newborns, but large interindividual variability in exposure levels exists. Based on a population pharmacokinetic analysis of a cohort of unselected neonates, we aimed to validate current dosing recommendations from a recent reference guideline (Neofax®). Methods: From 3039 concentrations collected in 994 preterm (median gestational age 32.3 weeks, range 24.2-36.5) and 455 term newborns treated at the University Hospital of Lausanne between 2006 and 2011, a population pharmacokinetic analysis was performed with NONMEM®. Model-based simulations were used to assess the ability of dosing regimens to bring concentrations into the targets: trough ≤ 1 mg/L and peak ≈ 8 mg/L. Results: A two-compartment model best characterized gentamicin pharmacokinetics. Model parameters are presented in the table. Body weight, gestational age and postnatal age positively influence clearance, which decreases under dopamine administration. Body weight and gestational age influence the distribution volume. Model-based simulations confirm that preterm infants need doses above 4 mg/kg and extended dosing intervals, up to 48 hours for very preterm newborns, whereas most term newborns would achieve adequate exposure under 4 mg/kg q24h. More than 90% of neonates would achieve trough concentrations below 2 mg/L and peaks above 6 mg/L following the most recent guidelines. Conclusions: Simulated gentamicin exposure shows good accordance with recent dosing recommendations for target concentration achievement.
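As an illustration of the kind of model-based simulation described, the following Python sketch computes steady-state peak and trough concentrations for a 4 mg/kg q24h regimen under a two-compartment IV bolus model. The parameter values are hypothetical placeholders, not the study's population estimates.

```python
# Sketch: trough/peak gentamicin concentrations under a q24h regimen with a
# two-compartment IV model. Parameters are illustrative, not fitted values.
import numpy as np
from scipy.integrate import odeint

CL, V1, Q, V2 = 0.05, 0.45, 0.03, 0.35  # L/h/kg, L/kg, L/h/kg, L/kg (hypothetical)

def two_cmt(y, t):
    a1, a2 = y                    # amounts (mg/kg) in central and peripheral compartments
    c1, c2 = a1 / V1, a2 / V2
    return [-CL * c1 - Q * (c1 - c2), Q * (c1 - c2)]

dose, tau, n_doses = 4.0, 24.0, 4  # 4 mg/kg every 24 h
y = np.array([0.0, 0.0])
for _ in range(n_doses):
    y = y + np.array([dose, 0.0])              # IV bolus into central compartment
    t = np.linspace(0.0, tau, 49)
    sol = odeint(two_cmt, y, t)
    peak = sol[1, 0] / V1                      # ~30 min post-dose
    trough = sol[-1, 0] / V1
    y = sol[-1]
print(f"steady-state peak ~{peak:.1f} mg/L, trough ~{trough:.2f} mg/L")
```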
Abstract:
A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double-transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth can be adapted to the given quantile level. The procedure is useful for large data sets and improves quantile estimation in heavy-tailed distributions compared to other methods. Implementation is straightforward, and R programs are available.
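A minimal sketch of transformed kernel cdf estimation is given below; for simplicity it uses a single log transform and a rule-of-thumb bandwidth instead of the paper's double transformation and optimal bandwidth expressions.

```python
# Sketch: transformed kernel estimation of a cdf for extreme quantiles.
# Data are mapped to a lighter-tailed scale, the cdf is kernel-estimated
# there, and the quantile is mapped back.
import numpy as np
from scipy.stats import norm

def transformed_kernel_quantile(x, p):
    z = np.log(x)                                # transform to tame the heavy tail
    h = 1.06 * z.std(ddof=1) * len(z) ** (-0.2)  # rule-of-thumb bandwidth (placeholder)
    grid = np.linspace(z.min() - 3 * h, z.max() + 3 * h, 2000)
    # Kernel cdf estimate: average of Gaussian cdfs centered at the data
    F = norm.cdf((grid[:, None] - z[None, :]) / h).mean(axis=1)
    zq = np.interp(p, F, grid)                   # invert the estimated cdf
    return np.exp(zq)                            # back-transform the quantile

rng = np.random.default_rng(1)
sample = rng.pareto(2.5, size=5000) + 1.0        # heavy-tailed toy data
print(f"estimated 99.5% quantile: {transformed_kernel_quantile(sample, 0.995):.2f}")
```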
Abstract:
Many studies have forecast the possible impact of climate change on plant distribution using models based on ecological niche theory. In their basic implementation, niche-based models do not constrain predictions by dispersal limitations; hence, most niche-based modelling studies published so far have assumed dispersal to be either unlimited or null. However, depending on the rate of climatic change, landscape fragmentation and the dispersal capabilities of individual species, these assumptions are likely to prove inaccurate, leading to under- or overestimation of future species distributions and yielding large uncertainty between these two extremes. As a result, the concepts of "potentially suitable" and "potentially colonisable" habitat are expected to differ significantly. To quantify to what extent these two concepts can differ, we developed MIGCLIM, a model simulating plant dispersal under climate change and landscape fragmentation scenarios. MIGCLIM implements various parameters, such as dispersal distance, increase in reproductive potential over time, barriers to dispersal, and long-distance dispersal. Several simulations were run for two virtual species in a study area of the western Swiss Alps, varying dispersal distance and other parameters. Each simulation covered the hundred-year period 2001-2100, and three different IPCC-based temperature warming scenarios were considered. Our results indicate that: (i) with realistic parameter values, the future potential distributions generated using MIGCLIM can differ significantly (up to more than a 95% decrease in colonized surface) from those that ignore dispersal; (ii) this divergence increases both with increasing climate warming and over longer time periods; (iii) the uncertainty associated with the warming scenario can be nearly as large as that related to the dispersal parameters; (iv) accounting for dispersal, even roughly, can substantially reduce uncertainty in projections.
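As a toy illustration of dispersal-constrained colonization of the kind MIGCLIM formalizes, the following sketch iterates colonization of suitable cells within reach of occupied ones on a grid. It is a caricature for intuition only, not the MIGCLIM algorithm: suitability is static, and barriers and long-distance dispersal are omitted.

```python
# Sketch: toy dispersal-constrained colonization on a grid. Each generation,
# only suitable cells within `disp` 4-neighbourhood steps of an occupied
# cell can be colonized.
import numpy as np

def step(occupied, suitable, disp=1):
    """One generation: colonize suitable cells within `disp` grid steps."""
    reach = occupied.copy()
    for _ in range(disp):
        grown = reach.copy()
        grown[1:, :] |= reach[:-1, :]; grown[:-1, :] |= reach[1:, :]
        grown[:, 1:] |= reach[:, :-1]; grown[:, :-1] |= reach[:, 1:]
        reach = grown
    return occupied | (reach & suitable)

rng = np.random.default_rng(0)
suitable = rng.random((50, 50)) < 0.6      # static suitability map (would shift with climate)
occupied = np.zeros((50, 50), dtype=bool)
occupied[25, 25] = True                    # initial population

for _ in range(10):                        # ten generations
    occupied = step(occupied, suitable, disp=1)
print(f"colonized cells after 10 generations: {occupied.sum()} "
      f"of {suitable.sum()} suitable cells")
```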
Abstract:
The energy and structure of a dilute hard-disk Bose gas are studied in the framework of a variational many-body approach based on a Jastrow-correlated ground-state wave function. The asymptotic behaviors of the radial distribution function and the one-body density matrix are analyzed after solving the Euler equation obtained by a free minimization of the hypernetted-chain energy functional. Our results show important deviations from the available low-density expansions already at gas parameter values x ≈ 0.001. The condensate fraction in 2D is also computed and found to be generally lower than the 3D one at the same x.
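For orientation, the variational setup rests on two standard objects: the Jastrow ansatz for the ground state, and the condensate fraction read off the long-range limit of the one-body density matrix.

```latex
% Jastrow ansatz with pair correlation factor f(r_ij), and the condensate
% fraction as the long-range limit of the one-body density matrix rho_1:
\Psi(\mathbf{r}_1,\ldots,\mathbf{r}_N) = \prod_{i<j} f(r_{ij}),
\qquad
n_0 = \lim_{r \to \infty} \frac{\rho_1(r)}{n}
```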
Abstract:
It is a well-known phenomenon that the constant-amplitude fatigue limit of a large component is lower than the fatigue limit of a small specimen made of the same material. In notched components the opposite occurs: the fatigue limit, defined as the maximum stress at the notch, is higher than that achieved with smooth specimens. These two effects are taken into account in most design handbooks with the help of empirical formulas or design curves. The basic idea of this study is that the size effect can mainly be explained by the statistical size effect. A component subjected to an alternating load can be assumed to form a sample of initiated cracks at the end of the crack initiation phase; the size of the sample depends on the size of the specimen in question. The main objective of this study is to develop a statistical model for estimating this kind of size effect. It is shown that the size of a sample of initiated cracks should be based on the stressed surface area of the specimen. In the case of a varying stress distribution, an effective stress area must be calculated, based on the decreasing probability of equally sized initiated cracks at lower stress levels. If the distribution function of the parent population of cracks is known, the distribution of the maximum crack size in a sample can be derived, making it possible to estimate the largest expected crack in any sample size. The fatigue limit can then be estimated with the help of linear elastic fracture mechanics. In notched components another source of size effect has to be taken into account. Consider two specimens of similar shape but different size: the stress gradient in the smaller specimen is steeper, so if there is an initiated crack in both of them, the stress intensity factor at the crack in the larger specimen is higher. The second goal of this thesis is to create a calculation method for this factor, which is called the geometric size effect. The proposed method for calculating the geometric size effect is also based on linear elastic fracture mechanics. An accurate value of the stress intensity factor in a non-linear stress field can be calculated using weight functions. The calculated stress intensity factor values at the initiated crack are compared to the corresponding stress intensity factor due to constant stress, and the notch size effect is calculated as the ratio of these stress intensity factors. The presented methods were tested against experimental results taken from three German doctoral theses. Two candidates for the parent population of initiated cracks were found: the Weibull distribution and the log-normal distribution. Both can be used successfully to predict the statistical size effect for smooth specimens. For notched components, the geometric size effect due to the stress gradient has to be combined with the statistical size effect. The proposed method gives good results as long as the notch in question is blunt enough. For very sharp notches, with a stress concentration factor of about 5 or higher, the method does not give satisfactory results; it is shown that the plastic portion of the strain becomes quite high at the root of such notches, making the use of linear elastic fracture mechanics questionable.
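The statistical argument sketched above can be summarized in two standard relations consistent with the abstract: the extreme-value form of the largest crack in a sample of m initiated cracks, and the LEFM threshold condition giving the fatigue limit estimate.

```latex
% Largest crack in a sample of m independent initiated cracks with parent
% cdf F(a); m scales with the (effective) stressed surface area A_eff:
F_{\max}(a) = \left[ F(a) \right]^{m}, \qquad m \propto A_{\mathrm{eff}},
% LEFM threshold condition for the fatigue limit (Y: geometry factor,
% a_max: largest expected initiated crack):
\Delta\sigma_{w} = \frac{\Delta K_{\mathrm{th}}}{Y \sqrt{\pi\, a_{\max}}}
```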
Abstract:
The aim of this study is to give form to the experience-based knowledge of diabetics. The broader intent is to be able to transform this experience-based knowledge into an asset within caring. In this study, a theoretical context for the empirical data is provided through phronesis, i.e. practical wisdom. Phronesis can be seen as the most suitable form of knowledge for deepening the individual's understanding of experience-based knowledge. For this research, hermeneutic phenomenology was chosen. Abductive reasoning was the method chosen to approach the data, collected through repeated in-depth interviews with individuals with personal experience of diabetes and the use of insulin pumps. The abductive approach facilitates a broader interpretation of the primary empirical results via a theory of philosophy of science, such as phronesis, the life-world and the negativity of experience. The latent message of the empirical data is thereby also highlighted. The synthesis reveals that experience-based knowledge arrives with time; it is personified and praxis-oriented, and before this time, knowledge and security must be provided by the established care, by people close to the individual or by other external sources. The experience-based knowledge has strengths and weaknesses, and is further characterized by the individual's ability to discern and make judgements. Additionally, experience-based knowledge is a reflective and action-based knowledge striving to improve the care provided. The experience-based knowledge held by the individual is potentially a great instrument for improving general knowledge, with possible practical applications within diabetic care, foremost in practical suggestions to facilitate care. In generalizing knowledge gathered from the individual's experiential point of view, there are inherent risks. These risks could potentially be eliminated through the adoption of a concept in which the established care functions as a quality guarantor: a concept taking into account experience-based knowledge as a source of information and knowledge in the care for diabetics. Co-created knowledge and understanding is a position found in both self-care and pump treatment. It is also found through the optimal application of the experience-based knowledge of the individual, as well as the knowledge found within the established care, in order to facilitate well-being, as expressed by the individual's phronesis-based knowledge.
Abstract:
Solid-state nuclear magnetic resonance (NMR) spectroscopy is a powerful technique for studying structural and dynamical properties of disordered and partially ordered materials, such as glasses, polymers, liquid crystals, and biological materials. In particular, two-dimensional (2D) NMR methods such as ¹³C-¹³C correlation spectroscopy under magic-angle-spinning (MAS) conditions have been used to measure structural constraints on the secondary structure of proteins and polypeptides. Amyloid fibrils implicated in a broad class of diseases such as Alzheimer's are known to contain a particular repeating structural motif, called a β-sheet. However, the details of such structures are poorly understood, primarily because the structural constraints extracted from the 2D NMR data in the form of the so-called Ramachandran (backbone torsion) angle distributions, g(φ,ψ), are strongly model-dependent. Inverse theory methods are used to extract Ramachandran angle distributions from a set of 2D MAS and constant-time double-quantum-filtered dipolar recoupling (CTDQFD) data. This is a vastly underdetermined problem, and the stability of the inverse mapping is problematic. Tikhonov regularization is a well-known method of improving the stability of the inverse; in this work it is extended to use a new regularization functional based on the Laplacian rather than on the norm of the function itself. In this way, one makes use of the inherently two-dimensional nature of the underlying Ramachandran maps. In addition, a modification of the existing numerical procedure is performed, as appropriate for an underdetermined inverse problem. Stability of the algorithm with respect to the signal-to-noise (S/N) ratio is examined using a simulated data set. The results show excellent convergence to the true angle distribution function g(φ,ψ) for S/N ratios above 100.
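A minimal sketch of the Laplacian-penalized Tikhonov step, reduced to one dimension for brevity (the thesis applies the 2-D Laplacian on the Ramachandran map), is given below; the problem sizes and the regularization weight are illustrative.

```python
# Sketch: Tikhonov regularization with a Laplacian penalty for an
# underdetermined linear inversion, min ||A x - b||^2 + lam ||L x||^2,
# solved via the normal equations.
import numpy as np

def second_difference(n):
    """1-D discrete Laplacian (second-difference matrix), shape (n-2, n)."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def tikhonov_laplacian(A, b, lam):
    """Solve (A^T A + lam L^T L) x = A^T b."""
    n = A.shape[1]
    L = second_difference(n)
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)

rng = np.random.default_rng(0)
n, m = 80, 20                       # 80 unknowns, only 20 measurements
x_true = np.exp(-0.5 * ((np.arange(n) - 40) / 6.0) ** 2)   # smooth "distribution"
A = rng.standard_normal((m, n))
b = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = tikhonov_laplacian(A, b, lam=10.0)
print(f"relative error: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.2f}")
```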
Abstract:
A feature-based fitness function is applied in a genetic programming system to synthesize stochastic gene regulatory network models whose behaviour is defined by a time course of protein expression levels. Typically, when targeting time series data, the fitness function is based on a sum of errors involving the values of the fluctuating signal. While this approach is successful in many instances, its performance can deteriorate in the presence of noise. This thesis explores a fitness measure determined from a set of statistical features characterizing the time series' sequence of values, rather than the actual values themselves. Through a series of experiments involving symbolic regression with added noise and gene regulatory network models based on the stochastic π-calculus, the measure is shown to successfully target oscillating and non-oscillating signals. This practical and versatile fitness function offers an alternative approach worthy of consideration for use in algorithms that evaluate noisy or stochastic behaviour.
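A minimal sketch of such a feature-based fitness is shown below; the particular feature set (mean, variability, lag-1 autocorrelation, dominant frequency) is an illustrative stand-in for the thesis's exact choice.

```python
# Sketch: feature-based fitness. Candidate and target time series are
# compared through summary statistics rather than pointwise errors,
# which is more forgiving of stochastic fluctuations.
import numpy as np

def features(ts):
    ts = np.asarray(ts, dtype=float)
    d = np.diff(ts)
    ac1 = np.corrcoef(ts[:-1], ts[1:])[0, 1]     # lag-1 autocorrelation
    spec = np.abs(np.fft.rfft(ts - ts.mean()))
    dom = np.argmax(spec[1:]) + 1                # dominant nonzero frequency bin
    return np.array([ts.mean(), ts.std(), d.std(), ac1, float(dom)])

def fitness(candidate, target):
    """Negative scaled distance between feature vectors (higher is better)."""
    fc, ft = features(candidate), features(target)
    scale = np.abs(ft) + 1e-9
    return -np.linalg.norm((fc - ft) / scale)

rng = np.random.default_rng(2)
t = np.linspace(0, 20, 400)
target = 5 + np.sin(t) + 0.3 * rng.standard_normal(t.size)
good = 5 + np.sin(t + 1.0) + 0.3 * rng.standard_normal(t.size)  # same dynamics, shifted phase
bad = 5 + 0.3 * rng.standard_normal(t.size)                     # no oscillation
print(f"fitness(good) = {fitness(good, target):.2f}, fitness(bad) = {fitness(bad, target):.2f}")
```

A pointwise sum-of-errors would penalize the phase-shifted but dynamically correct candidate as heavily as the flat one; the feature distance does not.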
Abstract:
The purpose of this project was to discern the inherent tension present in narratives told by adolescents with a visual impairment as they attempted to make sense of their experiences, specifically those surrounding risk. Mediated action, based on the foundational work of Vygotsky and Bakhtin, was used as both a theoretical and methodological approach; it is the theory that there are two components that constitute any human action: the "agent," or the person who is doing the acting, and the "mediational means" that he or she is using to accomplish the action in question. Tension ensues as neither is able to fully explain human behaviour. Ten adolescents with a visual impairment participated in a narrative interview, revealing numerous counter-narratives surrounding risk-taking, including "experimentation undertaken using good judgment." Participants offered examples of how they engaged, appropriated, resisted and transformed the dominant narratives of disability and adolescence in their identity formation.