899 results for particle trajectory computation


Relevance: 20.00%

Abstract:

The objective of this work was to evaluate the effect of limestone particle size in the diet and of lighting regime on egg and bone quality and on the performance of commercial laying hens. Three hundred Hissex White layers, at 18 weeks of age, were distributed in a completely randomized design, in a 5×2 factorial arrangement (coarse limestone in the diet at 0, 25, 50, 75, and 100%; with or without artificial light), with five replicates of six birds. No significant interaction was observed between particle size and lighting regime for the evaluated parameters. There was no significant effect of the level of coarse limestone in the diet on the performance and egg quality of the hens; however, bone deformity (3.23 to 4.01 mm), strength (5.19 to 6.70 kgf cm⁻²), and mineral matter (51.09 to 59.61%) improved as the proportion of coarse limestone increased. Regarding lighting regime, the treatment with artificial light yielded higher Haugh unit values (87.17 vs. 85.54) than natural light alone. Coarser limestone particles improve the bone quality of laying hens, and the use of artificial light can benefit the albumen quality of the eggs.
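
The experiment above is a completely randomized 5×2 factorial design, which is typically analyzed with a two-way ANOVA testing both main effects (limestone level, lighting regime) and their interaction. The sketch below is only an illustration of that kind of analysis, assuming a hypothetical data file and column names; it is not the authors' actual statistical procedure.

```python
# Minimal sketch (assumed data layout, not the authors' analysis): two-way
# ANOVA for a 5x2 completely randomized factorial design.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical CSV with columns: limestone (0/25/50/75/100), light (yes/no),
# and a response such as bone strength.
data = pd.read_csv("bone_strength.csv")

# Both factors treated as categorical; the "*" includes the interaction term.
model = smf.ols("strength ~ C(limestone) * C(light)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # Type II sums of squares
```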

Relevance: 20.00%

Abstract:

Objective: We propose and validate a computer-aided system to measure three different mandibular indexes: cortical width, panoramic mandibular index, and mandibular alveolar bone resorption index. Study Design: The repeatability and reproducibility of the measurements are analyzed and compared to the manual estimation of the same indexes. Results: The proposed computerized system exhibits better repeatability and reproducibility than standard manual methods. Moreover, the time required to perform the measurements with the proposed method is negligible compared to that needed to perform them manually. Conclusions: We have proposed a very user-friendly computerized method to measure three morphometric mandibular indexes. From the results we conclude that the system provides a practical way to perform these measurements. It does not require an expert examiner and takes no more than 16 seconds per analysis. Thus, it may be suitable for diagnosing osteoporosis from dental panoramic radiographs.
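
The abstract does not state which statistic was used to quantify repeatability and reproducibility; a common choice for repeated measurements is the intraclass correlation coefficient. The sketch below computes a two-way random-effects ICC(2,1) from a hypothetical matrix of repeated index measurements and is offered only as an illustration, not as the paper's method.

```python
# Minimal sketch (not the paper's statistics): two-way random-effects
# intraclass correlation coefficient, ICC(2,1), often used to quantify
# repeatability/reproducibility of repeated measurements.
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ratings: (n_subjects, k_raters_or_sessions) matrix of measurements."""
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per subject
    col_means = ratings.mean(axis=0)   # per rater / session
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: five radiographs, each measured in two sessions.
measurements = np.array([[3.1, 3.0], [4.2, 4.3], [2.8, 2.9], [3.9, 4.0], [3.5, 3.4]])
print(f"ICC(2,1) = {icc_2_1(measurements):.3f}")
```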

Relevance: 20.00%

Abstract:

Optimal vaccine strategies must be identified to improve T-cell vaccination against infectious and malignant diseases. MelQbG10 is a virus-like nanoparticle loaded with A-type CpG-oligonucleotides (CpG-ODN) and coupled to peptide(16-35) derived from Melan-A/MART-1. In this phase IIa clinical study, four groups of stage III-IV melanoma patients were vaccinated with MelQbG10, given (i) with IFA (Montanide) s.c.; (ii) with IFA s.c. and topical Imiquimod; (iii) i.d. with topical Imiquimod; or (iv) as an intralymph node injection. In total, 16/21 (76%) patients generated ex vivo detectable Melan-A/MART-1-specific T-cell responses. T-cell frequencies were significantly higher when IFA was used as adjuvant, resulting in detectable T-cell responses in all (11/11) patients, with predominant generation of effector-memory-phenotype cells. In turn, Imiquimod induced higher proportions of central-memory-phenotype cells and increased percentages of CD127(+) (IL-7R) T cells. Direct injection of MelQbG10 into lymph nodes resulted in lower T-cell frequencies, associated with lower proportions of memory- and effector-phenotype T cells. Swelling of vaccine-site-draining lymph nodes and increased glucose uptake on PET/CT were observed in 13/15 (87%) of evaluable patients, reflecting vaccine-triggered immune reactions in the lymph nodes. We conclude that the simultaneous use of both Imiquimod and CpG-ODN induced combined memory and effector CD8(+) T-cell responses.

Relevance: 20.00%

Abstract:

The past few decades have seen a considerable increase in the number of parallel and distributed systems. With the development of more complex applications, the need for more powerful systems has emerged, and various parallel and distributed environments have been designed and implemented. Each of these environments, including hardware and software, has unique strengths and weaknesses. There is no single parallel environment that can be identified as the best for all applications with respect to hardware and software properties. The main goal of this thesis is to provide a novel way of performing data-parallel computation in parallel and distributed environments by utilizing the best characteristics of different aspects of parallel computing. For the purpose of this thesis, three aspects of parallel computing were identified and studied. First, three parallel environments (shared memory, distributed memory, and a network of workstations) are evaluated to quantify their suitability for different parallel applications. Due to the parallel and distributed nature of the environments, the networks connecting the processors in these environments were investigated with respect to their performance characteristics. Second, scheduling algorithms are studied in order to make them more efficient and effective. A concept of application-specific information scheduling is introduced. The application-specific information is data about the workload extracted from an application, which is provided to a scheduling algorithm. Three scheduling algorithms are enhanced to utilize the application-specific information to further refine their scheduling properties. A more accurate description of the workload is especially important when the work units are heterogeneous and the parallel environment is heterogeneous and/or non-dedicated. The results obtained show that the additional information regarding the workload has a positive impact on the performance of applications. Third, a programming paradigm for networks of symmetric multiprocessor (SMP) workstations is introduced. The MPIT programming paradigm combines the Message Passing Interface (MPI) with threads to provide a methodology for writing parallel applications that efficiently utilize the available resources and minimize overhead. MPIT allows communication and computation to overlap by deploying a dedicated thread for communication. Furthermore, the programming paradigm implements an application-specific scheduling algorithm, which is executed by the communication thread; thus, scheduling does not affect the execution of the parallel application. Performance results show that MPIT achieves considerable improvements over conventional MPI applications.
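
The idea of overlapping communication and computation with a dedicated communication thread can be illustrated with a short sketch. The code below uses mpi4py and Python threads and is only a rough, hypothetical illustration of that pattern, not the MPIT implementation described in the thesis; it assumes an MPI library built with full thread support (MPI_THREAD_MULTIPLE) and would be launched with mpiexec.

```python
# Minimal sketch (not the thesis's MPIT): a dedicated communication thread
# ships finished work chunks to rank 0 while the main thread keeps computing.
# Requires an MPI build with MPI_THREAD_MULTIPLE; run with e.g.
#   mpiexec -n 4 python overlap_sketch.py
import queue
import threading
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
CHUNKS_PER_RANK = 4

outbox: "queue.Queue" = queue.Queue()
collected = []                       # only used on rank 0

def communicator() -> None:
    """Dedicated communication thread."""
    while True:
        chunk = outbox.get()
        if chunk is None:            # sentinel: computation is finished
            break
        if rank == 0:
            collected.append(chunk)  # rank 0 keeps its own results locally
        else:
            comm.send(chunk, dest=0, tag=42)

comm_thread = threading.Thread(target=communicator)
comm_thread.start()

# Main thread: compute chunks and hand them off without waiting for sends.
data = np.random.default_rng(rank).random((CHUNKS_PER_RANK, 1000))
for row in data:
    outbox.put(np.sort(row))         # np.sort stands in for real work
outbox.put(None)
comm_thread.join()

if rank == 0:
    for _ in range(CHUNKS_PER_RANK * (size - 1)):
        collected.append(comm.recv(source=MPI.ANY_SOURCE, tag=42))
    print(f"rank 0 collected {len(collected)} chunks")
```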

Relevance: 20.00%

Abstract:

Convective transport, both pure and combined with diffusion and reaction, can be observed in a wide range of physical and industrial applications, such as heat and mass transfer, crystal growth, or biomechanics. The numerical approximation of this class of problems can present substantial difficulties due to regions of high gradients (steep fronts) in the solution, where the generation of spurious oscillations or smearing must be precluded. This work is devoted to the development of an efficient numerical technique for pure linear convection and convection-dominated problems in the framework of convection-diffusion-reaction systems. The particle transport method developed in this study is based on meshless numerical particles that carry the solution along the characteristics defining the convective transport. The resolution of steep fronts in the solution is controlled by a special spatial adaptivity procedure. The semi-Lagrangian particle transport method uses a fixed Eulerian grid to represent the solution. For convection-diffusion-reaction problems, the method is combined with diffusion and reaction solvers within an operator splitting approach. To transfer the solution from the particle set onto the grid, a fast monotone projection technique is designed. Our numerical results confirm that the method has second-order spatial accuracy and can be faster than typical grid-based methods of the same order; for pure linear convection problems the method demonstrates optimal linear complexity. The method works on structured and unstructured meshes, demonstrating a high-resolution property in regions of steep fronts in the solution. Moreover, the particle transport method can be successfully used for the numerical simulation of real-life problems in, for example, chemical engineering.
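
To make the semi-Lagrangian particle idea concrete, the sketch below advects particles along the characteristics of a one-dimensional linear convection equation and then projects them onto a fixed Eulerian grid. It is only an illustrative toy: it uses plain linear interpolation for the projection instead of the fast monotone projection and the spatial adaptivity described above.

```python
# Minimal sketch (not the thesis's method): 1D particle transport for the
# linear advection equation u_t + a u_x = 0. Particles carry the solution
# along the characteristics dx/dt = a and are then projected onto a fixed
# Eulerian grid by interpolation.
import numpy as np

a = 1.0                      # constant advection velocity
T, dt = 0.5, 0.01            # final time and time step
grid = np.linspace(0.0, 1.0, 101)

# Seed one particle per grid node, carrying the initial profile.
x_p = grid.copy()
u_p = np.exp(-200.0 * (x_p - 0.3) ** 2)   # smooth bump as initial data

t = 0.0
while t < T - 1e-12:
    x_p += a * dt            # move particles along the characteristics
    t += dt

# Project the particle solution onto the fixed grid (periodic domain).
x_wrapped = np.mod(x_p, 1.0)
order = np.argsort(x_wrapped)
u_grid = np.interp(grid, x_wrapped[order], u_p[order], period=1.0)

# The exact solution is the initial bump shifted by a*T.
u_exact = np.exp(-200.0 * (np.mod(grid - a * T - 0.3 + 0.5, 1.0) - 0.5) ** 2)
print(f"max projection error: {np.abs(u_grid - u_exact).max():.3e}")
```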

Relevance: 20.00%

Abstract:

In this study, equations for the calculation of erosion wear caused by ash particles on the convective heat exchanger tubes of steam boilers are presented. A new, three-dimensional test arrangement was used for testing the erosion wear of convective heat exchanger tubes of steam boilers. Using the sleeve method, three different tube materials and three tube constructions could be tested. New results were obtained from the analyses. The main mechanisms of the erosion wear phenomenon, and erosion wear as a function of collision conditions and material properties, have been studied. Properties of fossil fuels are also presented. When burning solid fuels, such as pulverized coal and peat, in steam boilers, most of the ash is entrained by the flue gas in the furnace. In bubbling and circulating fluidized bed boilers, the particle concentration in the flue gas is high because of bed material entrained in the flue gas. Hard particles, such as sharp-edged quartz crystals, cause erosion wear when colliding with convective heat exchanger tubes and with the rear wall of the steam boiler. The most important ways to reduce erosion wear in steam boilers are to keep the velocity of the flue gas moderate and to prevent channelling of the ash flow in a certain part of the cross section of the flue gas channel, especially near the back wall. This can be done by constructing the boiler with the following components: screen plates can be used to make the velocity and ash flow distributions more even across the cross section of the channel, and shield plates and plate-type constructions in superheaters can also be used. Erosion testing was conducted with three types of tube constructions: a single tube row, an in-line tube bank with six tube rows, and a staggered tube bank with six tube rows. Three flow velocities and two particle concentrations were used in the tests, which were carried out at room temperature. Three particle materials were used: quartz, coal ash, and peat ash particles. Mass loss, diameter loss, and wall thickness loss of the test sleeves were measured. Erosion wear as a function of flow conditions, tube material, and tube construction was analyzed by single-variable linear regression analysis. In developing the erosion wear calculation equations, multivariable linear regression analysis was used. In the staggered tube bank, erosion wear had its maximum in tube row 2 and a local maximum in row 5; in rows 3, 4, and 6 the erosion rate was low. In the in-line tube bank, on the other hand, the minimum erosion rate occurred in tube row 2 and the erosion rate increased in subsequent rows, so that in a six-row tube bank the maximum occurred in row 6.
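
Erosion-wear correlations of the kind mentioned above are often fitted by multivariable linear regression after a logarithmic transformation. The sketch below fits a model of the assumed form E = k·v^a·c^b (wear as a function of flue-gas velocity and particle concentration); the functional form and all numbers are hypothetical illustrations, not the thesis's equations or data.

```python
# Minimal sketch (not the thesis's equations): fitting an erosion-wear
# correlation E = k * v**a * c**b by multivariable linear regression on
# the logarithms. All numbers below are hypothetical.
import numpy as np

# Hypothetical measurements: velocity [m/s], concentration [g/m^3], wear [mg]
v = np.array([ 5.0,  5.0, 10.0, 10.0, 15.0, 15.0])
c = np.array([10.0, 20.0, 10.0, 20.0, 10.0, 20.0])
E = np.array([ 0.8,  1.7,  5.9, 12.5, 21.0, 40.0])

# log E = log k + a log v + b log c  ->  ordinary least squares
X = np.column_stack([np.ones_like(v), np.log(v), np.log(c)])
coef, *_ = np.linalg.lstsq(X, np.log(E), rcond=None)
log_k, a, b = coef
print(f"k = {np.exp(log_k):.3g}, velocity exponent a = {a:.2f}, "
      f"concentration exponent b = {b:.2f}")
```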

Relevance: 20.00%

Abstract:

The particle number and volume distributions, and the removal efficiency of particles and suspended solids from different effluents and their filtrates, were analyzed to study whether the filters most commonly used in drip irrigation systems remove the particles that can clog emitters. In most of the effluents and filtrates, the number of particles with diameters larger than 20 μm was minimal. However, when the particle volume distribution was analyzed, the filtrates contained particles larger than the opening size of the disc and screen filters, with the sand filter retaining the largest-diameter particles. The efficiency of the filters in retaining particles depended more on the type of effluent than on the filter. It was also verified that the particle number distribution follows a power-law relationship. Analyzing the exponent β of the power law, it was found that the filters did not significantly modify the particle number distribution of the effluents.
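
The power-law relationship mentioned for the particle number distribution is commonly written as dN/dD ∝ D^(−β), and the exponent β can be estimated by a linear fit in log-log space. The sketch below illustrates this with hypothetical counts per size class; it is not the paper's data or procedure.

```python
# Minimal sketch (not the paper's analysis): estimating the power-law
# exponent beta of a particle number distribution dN/dD ~ A * D**(-beta)
# by linear regression in log-log space. Counts below are hypothetical.
import numpy as np

# Hypothetical particle counts per size class (class mid-diameters in um).
diameters = np.array([2.0, 3.0, 5.0, 8.0, 12.0, 20.0, 30.0])
counts    = np.array([9500, 3600, 1100, 330, 110, 30, 11])

slope, intercept = np.polyfit(np.log(diameters), np.log(counts), 1)
beta = -slope
print(f"estimated beta = {beta:.2f}")
```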

Relevance: 20.00%

Abstract:

In two previous papers [J. Differential Equations, 228 (2006), pp. 530-579; Discrete Contin. Dyn. Syst. Ser. B, 6 (2006), pp. 1261-1300] we developed fast algorithms for the computation of invariant tori in quasi-periodic systems and developed theorems that assess their accuracy. In this paper, we implement these algorithms and study their performance in practice. More importantly, we note that, due to the speed of the algorithms and the theoretical results on their reliability, we can compute with confidence invariant objects close to the breakdown of their hyperbolicity properties. This allows us to identify a mechanism of loss of hyperbolicity and measure some of its quantitative regularities. We find that some systems lose hyperbolicity because the stable and unstable bundles approach each other while the Lyapunov multipliers remain away from 1. We find empirically that, close to the breakdown, the distances between the invariant bundles and the Lyapunov multipliers, which are natural measures of hyperbolicity, depend on the parameters as power laws with universal exponents. We also observe that, even though the rigorous justifications in [J. Differential Equations, 228 (2006), pp. 530-579] are developed only for hyperbolic tori, the algorithms also work for elliptic tori in Hamiltonian systems. We can continue these tori and compute some bifurcations at resonance which may lead to the existence of hyperbolic tori with nonorientable bundles. We compute manifolds tangent to nonorientable bundles.
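
The Lyapunov multipliers mentioned above measure the exponential growth rate of a cocycle over the quasi-periodic rotation. The sketch below estimates the maximal Lyapunov multiplier of a simple 2×2 quasi-periodic cocycle by iteration and renormalization; the Harper-type cocycle and its parameters are chosen only for illustration and are unrelated to the systems studied in the paper.

```python
# Minimal sketch (not the paper's algorithms): estimating the maximal
# Lyapunov multiplier of a quasi-periodic 2x2 cocycle by iterating a
# vector and renormalizing its norm. A multiplier well above 1 signals
# uniform hyperbolicity of the cocycle.
import numpy as np

omega = (np.sqrt(5.0) - 1.0) / 2.0    # golden-mean rotation number
lam, E = 2.0, 0.0                     # hypothetical cocycle parameters
n_steps = 100_000

def cocycle_matrix(theta: float) -> np.ndarray:
    return np.array([[E - 2.0 * lam * np.cos(2.0 * np.pi * theta), -1.0],
                     [1.0, 0.0]])

theta = 0.1
v = np.array([1.0, 0.0])
log_growth = 0.0
for _ in range(n_steps):
    v = cocycle_matrix(theta) @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm                         # renormalize to avoid overflow
    theta = (theta + omega) % 1.0

multiplier = np.exp(log_growth / n_steps)
print(f"maximal Lyapunov multiplier ~ {multiplier:.4f}")
```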

Relevance: 20.00%

Abstract:

We show that the quasifission paths predicted by the one-body dissipation dynamics, in the slowest phase of a binary reaction, follow a quasistatic path, which represents a sequence of states of thermal equilibrium at a fixed value of the deformation coordinate. This establishes the use of the statistical particle-evaporation model in the case of dynamical time-evolving systems. Pre- and post-scission multiplicities of neutrons and total multiplicities of protons and α particles in fission reactions of ⁶³Cu+⁹²Mo, ⁶⁰Ni+¹⁰⁰Mo, ⁶³Cu+¹⁰⁰Mo at 10 MeV/u and ²⁰Ne+¹⁴⁴,¹⁴⁸,¹⁵⁴Sm at 20 MeV/u are reproduced reasonably well with statistical model calculations performed along dynamic trajectories whose slow stage (from the most compact configuration up to the point where the neck starts to develop) lasts some 35 × 10⁻²¹ s.

Relevance: 20.00%

Abstract:

Schizophrenia is a neurodevelopmental disorder reflecting a convergence of genetic risk and early life stress. The slow progression to the first psychotic episode represents both a window of vulnerability and an opportunity for therapeutic intervention. Here, we consider recent neurobiological insight into the cellular and molecular components of developmental critical periods and their vulnerability to redox dysregulation. In particular, the consistent loss of parvalbumin-positive interneuron (PVI) function, of their surrounding perineuronal nets (PNNs), and of myelination in patient brains is consistent with a delayed or extended period of circuit instability. This linkage to critical period triggers (PVI) and brakes (PNN, myelin) implicates mistimed trajectories of brain development in mental illness. Strategically introduced antioxidant treatment or later reinforcement of molecular brakes may then offer a novel prophylactic psychiatry.

Relevance: 20.00%

Abstract:

The deep outer margin of the Gulf of Lions and the adjacent basin, in the western Mediterranean Sea, are regularly impacted by open-ocean convection, a major hydrodynamic event responsible for the ventilation of the deep water in the western Mediterranean Basin. However, the impact of open-ocean convection on the flux and transport of particulate matter remains poorly understood. The variability of water mass properties (i.e., temperature and salinity), currents, and particle fluxes was monitored between September 2007 and April 2009 at five instrumented mooring lines deployed between 2050 and 2350 m depth on the deepest continental margin and adjacent basin. Four of the lines followed a NW-SE transect, while the fifth was located on a sediment wave field to the west. The results from the main, central line SC2350 ("LION"), located at 42°02.5′ N, 4°41.0′ E at 2350 m depth, show that open-ocean convection reached mid-water depth (~1000 m) during winter 2007-2008 and reached the seabed (~2350 m) during winter 2008-2009. Horizontal currents were unusually strong, with speeds up to 39 cm s⁻¹ during winter 2008-2009. The measurements at all five locations indicate that mid-depth and near-bottom currents and particle fluxes had relatively consistent values of similar magnitude across the study area, except during winter 2008-2009, when near-bottom fluxes abruptly increased by one to two orders of magnitude. Particulate organic carbon contents, which generally vary between 3 and 5%, were abnormally low (~1%) during winter 2008-2009 and approached those observed in surface sediments (0.6%). Turbidity profiles made in the region demonstrated the existence of a bottom nepheloid layer, several hundred meters thick, related to the resuspension of bottom sediments. These observations support the view that open-ocean deep convection events in the Gulf of Lions can cause significant remobilization of sediments on the deep outer margin and in the basin, with a subsequent alteration of the seabed likely impacting the functioning of the deep-sea ecosystem.

Relevance: 20.00%

Abstract:

Drug metabolism can produce metabolites with physicochemical and pharmacological properties that differ substantially from those of the parent drug, and consequently has important implications for both drug safety and efficacy. To reduce the risk of costly clinical-stage attrition due to the metabolic characteristics of drug candidates, there is a need for efficient and reliable ways to predict drug metabolism in vitro, in silico and in vivo. In this Perspective, we provide an overview of the state of the art of experimental and computational approaches for investigating drug metabolism. We highlight the scope and limitations of these methods, and indicate strategies to harvest the synergies that result from combining measurement and prediction of drug metabolism.

Relevance: 20.00%

Abstract:

The particle size of the raw material is a key material parameter in pharmaceutical development. The particle size of a drug substance affects many important properties of the drug product, for example its bioavailability. This master's thesis focused on determining the particle size of powdered drug substances by laser diffraction. The method is based on the fact that the angular distribution of the intensity of light scattered by the particles depends on their size distribution. The literature part of the thesis presents the theory of the laser diffraction method. The PIDS (Polarization Intensity Differential Scattering) technique, which can be used in conjunction with laser diffraction, is also described in the literature part. Among analysis methods based on other principles, microscopy and a method based on measuring the aerodynamic time of flight were reviewed. The literature part also presents the most common ways of expressing particle size. The aim of the experimental part was to develop and validate a laser diffraction based particle size method for a specific drug substance. Method development was carried out on a Beckman Coulter LS 13 320 laser diffraction analyzer, which allows the PIDS technique to be used alongside laser diffraction. Method development began with the assessment that the drug substance in question is best measured dispersed in a liquid. Based on solubility, an aqueous solution saturated with the drug substance was chosen as the dispersion medium. The use of a dispersing agent and an ultrasonic bath was found to be necessary when dispersing the drug substance into the saturated aqueous solution. Finally, the stirring speed in the sample delivery unit was adjusted to a suitable level. In the validation phase, the developed method was found to be well suited to the drug substance, and the results were shown to be accurate and repeatable. The method was also insensitive to small disturbances.
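
Among the common ways of expressing particle size mentioned above, volume-weighted percentiles such as D10, D50 and D90 are typical summary values reported from laser diffraction measurements. The sketch below computes them from a cumulative undersize curve; the size classes and volume fractions are hypothetical and unrelated to the thesis data.

```python
# Minimal sketch (not the thesis's procedure): computing the D10/D50/D90
# percentiles of a volume-weighted particle size distribution from
# hypothetical size classes and volume fractions.
import numpy as np

# Upper edges of the size classes [um] and volume fraction in each class [%].
size_um  = np.array([1, 2, 5, 10, 20, 50, 100], dtype=float)
vol_frac = np.array([2, 8, 20, 30, 25, 12, 3], dtype=float)

cumulative = np.cumsum(vol_frac) / vol_frac.sum() * 100.0   # undersize, %
d10, d50, d90 = np.interp([10.0, 50.0, 90.0], cumulative, size_um)
print(f"D10 = {d10:.1f} um, D50 = {d50:.1f} um, D90 = {d90:.1f} um")
```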

Relevance: 20.00%

Abstract:

The perceived low levels of genetic diversity, poor interspecific competitive and defensive ability, and loss of dispersal capacity of insular lineages have driven the view that oceanic islands are evolutionary dead ends. Focusing on the Atlantic bryophyte flora distributed across the archipelagos of the Azores, Madeira, the Canary Islands, Western Europe, and northwestern Africa, we used an integrative approach combining species distribution modeling and population genetic analyses based on approximate Bayesian computation to determine whether this view applies to organisms with inherently high dispersal capacities. Genetic diversity was found to be higher in island than in continental populations, contributing to mounting evidence that, contrary to theoretical expectations, island populations are not necessarily genetically depauperate. Patterns of genetic variation among island and continental populations consistently fitted those simulated under a scenario of de novo foundation of continental populations from insular ancestors better than those expected if islands represented a sink or a refugium of continental biodiversity. We suggest that the northeastern Atlantic archipelagos have played a key role as stepping stones for transoceanic migrants. Our results challenge the traditional notion that oceanic islands are the end of the colonization road and illustrate the significant role of oceanic islands as reservoirs of novel biodiversity for the assembly of continental floras.
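
Approximate Bayesian computation, named above, compares summary statistics of data simulated under candidate parameter values with the observed statistics and retains the parameter draws that come close enough. The sketch below shows the basic rejection-ABC loop on a deliberately simple toy model; it is not the authors' analysis, which would involve coalescent simulations and multiple summary statistics.

```python
# Minimal sketch (not the authors' pipeline): rejection-based approximate
# Bayesian computation (ABC). A candidate parameter is drawn from the prior,
# data are simulated, and the draw is kept only if the simulated summary
# statistic is close to the observed one. The toy model is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
observed_summary = 7.0        # hypothetical observed summary statistic
tolerance = 1.0
n_draws = 100_000

def simulate_summary(rate: float) -> float:
    """Toy simulator standing in for a coalescent/colonization model."""
    return float(rng.poisson(lam=rate))

accepted = []
for _ in range(n_draws):
    rate = rng.uniform(0.0, 20.0)                 # prior on the parameter
    if abs(simulate_summary(rate) - observed_summary) <= tolerance:
        accepted.append(rate)

posterior = np.array(accepted)
print(f"accepted {posterior.size} draws; "
      f"posterior mean rate ~ {posterior.mean():.2f}")
```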