8 results for Make-believe
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In Bohmian mechanics, a version of quantum mechanics that ascribes world lines to electrons, we can meaningfully ask about an electron's instantaneous speed relative to a given inertial frame. Interestingly, according to the relativistic version of Bohmian mechanics using the Dirac equation, a massive particle's speed is less than or equal to the speed of light, but not necessarily less. That is, there are situations in which the particle actually reaches the speed of light, a very nonclassical behavior. That leads us to the question of whether such situations can be arranged experimentally. We prove a theorem, Theorem 5, implying that for generic initial wave functions the probability that the particle ever reaches the speed of light, even if at only one point in time, is zero. We conclude that the answer to the question is no. Since a trajectory reaches the speed of light whenever the quantum probability current $\bar{\psi}\gamma^{\mu}\psi$ is a lightlike 4-vector, our analysis concerns the current vector field of a generic wave function and may thus be of interest also independently of Bohmian mechanics. The fact that the current is never spacelike has been used to argue against the possibility of faster-than-light tunneling through a barrier, a somewhat similar question. Theorem 5, as well as a more general version provided by Theorem 6, is also interesting in its own right. These theorems concern a certain property of a function $\psi : \mathbb{R}^4 \to \mathbb{C}^4$ that is crucial to the question of reaching the speed of light, namely being transverse to a certain submanifold of $\mathbb{C}^4$ along a given compact subset of space-time. While it follows from the known transversality theorem of differential topology that this property is generic among smooth functions $\psi : \mathbb{R}^4 \to \mathbb{C}^4$, Theorem 5 asserts that it is also generic among smooth solutions of the Dirac equation. (C) 2010 American Institute of Physics. [doi:10.1063/1.3520529]
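For orientation, and using standard Dirac/Bohmian notation that is assumed here rather than quoted from the paper: the Dirac probability current, the guidance law built from it, and the condition under which a trajectory reaches the speed of light can be written as

\[
  j^{\mu} = \bar{\psi}\,\gamma^{\mu}\psi , \qquad
  \frac{dX^{k}}{dt} = c\,\frac{j^{k}(t, X(t))}{j^{0}(t, X(t))} , \qquad
  \Bigl|\frac{dX}{dt}\Bigr| = c \;\Longleftrightarrow\; j^{\mu} j_{\mu} = 0 ,
\]

since \(|dX/dt|^{2} = c^{2}\sum_{k}(j^{k})^{2}/(j^{0})^{2}\) equals \(c^{2}\) exactly when \((j^{0})^{2} = \sum_{k}(j^{k})^{2}\), i.e. when the 4-vector \(j^{\mu}\) is lightlike.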
Abstract:
Causal inference methods - mainly path analysis and structural equation modeling - offer plant physiologists information about cause-and-effect relationships among plant traits. Recently, an unusual approach to causal inference through stepwise variable selection has been proposed and used in various works on plant physiology. This approach should not be considered correct from a biological point of view. Here, we explain why stepwise variable selection should not be used for causal inference and show what strange conclusions can be drawn from such an analysis when one aims to interpret cause-and-effect relationships among plant traits.
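As an illustration of the point (a minimal sketch with synthetic data, not taken from the paper): in a simple causal chain X -> M -> Y, forward stepwise selection of predictors of Y retains only M and discards X, so reading the selected model causally would wrongly suggest that X has no effect on Y, even though X is a genuine, if indirect, cause of Y.

import numpy as np

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)
M = X + rng.normal(scale=0.5, size=n)        # X causes M
Y = M + rng.normal(scale=0.5, size=n)        # M causes Y; X acts on Y only through M

def r2(cols, y):
    # R^2 of an ordinary least-squares fit of y on the given columns plus an intercept
    A = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1.0 - np.var(y - A @ beta) / np.var(y)

candidates = {"X": X, "M": M}
selected = {}
gain_threshold = 0.01                        # minimum R^2 improvement to keep a variable
while candidates:
    base = r2(list(selected.values()), Y) if selected else 0.0
    gains = {name: r2(list(selected.values()) + [col], Y) - base
             for name, col in candidates.items()}
    best = max(gains, key=gains.get)
    if gains[best] < gain_threshold:
        break
    selected[best] = candidates.pop(best)

print("stepwise keeps:", list(selected))     # typically ['M'] only

The selection is perfectly reasonable for prediction, but a causal reading of it ("X does not influence Y") contradicts the data-generating process, which is exactly the kind of strange conclusion referred to above.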
Abstract:
Clinical applications of quantitative computed tomography (qCT) in patients with pulmonary opacifications are hindered by the radiation exposure and by the arduous manual image processing. We hypothesized that extrapolation from only ten thoracic CT sections would provide reliable information on the aeration of the entire lung. CTs of 72 patients with normal and 85 patients with opacified lungs were studied retrospectively. Volumes and masses of the lung and its differently aerated compartments were obtained from all CT sections. Then only the most cranial and caudal sections and a further eight evenly spaced sections between them were selected. The results from these ten sections were extrapolated to the entire lung. The agreement between both methods was assessed with Bland-Altman plots. Median (range) total lung volume and mass were 3,738 (1,311-6,768) ml and 957 (545-3,019) g; the corresponding bias (limits of agreement) were 26 (-42 to 95) ml and 8 (-21 to 38) g, respectively. The median volumes (range) of the differently aerated compartments (as a percentage of total lung volume) were 1 (0-54)% for the nonaerated, 5 (1-44)% for the poorly aerated, 85 (28-98)% for the normally aerated, and 4 (0-48)% for the hyperaerated subvolume. The agreement between the extrapolated results and those from all CT sections was excellent: all bias values were below 1% of the total lung volume or mass, and the limits of agreement never exceeded +/- 2%. The extrapolation method can reduce radiation exposure and shorten the time required for qCT analysis of lung aeration.
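A minimal sketch of the sampling-and-extrapolation step in Python (the exact extrapolation formula is not spelled out in the abstract; the rule below, scaling the ten-section result by the sampling fraction, is an assumption for illustration only):

import numpy as np

def extrapolate_from_ten_sections(per_section_values):
    # per_section_values: one value per axial CT section (e.g. aerated volume in ml)
    v = np.asarray(per_section_values, dtype=float)
    n = len(v)
    # most cranial and most caudal sections plus eight evenly spaced sections in between
    idx = np.unique(np.round(np.linspace(0, n - 1, 10)).astype(int))
    # assumed rule: scale the sampled sum by the fraction of sections sampled
    return v[idx].sum() * n / len(idx)

# synthetic example with 60 sections
rng = np.random.default_rng(1)
sections = rng.uniform(40, 80, size=60)
print("all sections :", round(sections.sum(), 1))
print("extrapolated :", round(extrapolate_from_ten_sections(sections), 1))

In the study itself, the agreement between the two estimates was then quantified with Bland-Altman plots (bias and limits of agreement), as summarized above.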
Abstract:
The human brain is often considered to be the most cognitively capable among mammalian brains and to be much larger than expected for a mammal of our body size. Although the number of neurons is generally assumed to be a determinant of computational power, and despite the widespread quotes that the human brain contains 100 billion neurons and ten times more glial cells, the absolute number of neurons and glial cells in the human brain remains unknown. Here we determine these numbers by using the isotropic fractionator and compare them with the expected values for a human-sized primate. We find that the adult male human brain contains on average 86.1 +/- 8.1 billion NeuN-positive cells ("neurons") and 84.6 +/- 9.8 billion NeuN-negative ("nonneuronal") cells. With only 19% of all neurons located in the cerebral cortex, greater cortical size (representing 82% of total brain mass) in humans compared with other primates does not reflect an increased relative number of cortical neurons. The ratios between glial cells and neurons in the human brain structures are similar to those found in other primates, and their numbers of cells match those expected for a primate of human proportions. These findings challenge the common view that humans stand out from other primates in their brain composition and indicate that, with regard to numbers of neuronal and nonneuronal cells, the human brain is an isometrically scaled-up primate brain. J. Comp. Neurol. 513:532-541, 2009. (c) 2009 Wiley-Liss, Inc.
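Two of the figures above combine directly (the symbols are ours, not the paper's):

\[
  \frac{N_{\mathrm{nonneuronal}}}{N_{\mathrm{neurons}}} \approx \frac{84.6}{86.1} \approx 0.98
  \quad (\text{close to } 1{:}1), \qquad
  N_{\mathrm{cortical\ neurons}} \approx 0.19 \times 86.1 \times 10^{9} \approx 1.6 \times 10^{10}.
\]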
Abstract:
Process scheduling techniques consider the current load situation to allocate computing resources. Those techniques make approximations such as the average of communication, processing, and memory access to improve the process scheduling, although processes may present different behaviors during their whole execution: they may start with high communication requirements and later perform only processing. By discovering how processes behave over time, we believe it is possible to improve the resource allocation. This has motivated this paper, which adopts chaos theory concepts and nonlinear prediction techniques in order to model and predict process behavior. Results confirm that the radial basis function technique presents good predictions at low processing demands, which is essential in a real distributed environment.
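A minimal sketch of the general idea in Python with numpy only (delay embedding of an observed load series plus Gaussian radial basis functions to predict the next value); the features, embedding parameters and RBF variant actually used in the paper may differ:

import numpy as np

def embed(series, dim, tau=1):
    # delay embedding: row i is [x_i, x_{i+tau}, ..., x_{i+(dim-1)tau}]
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + dim * tau:tau] for i in range(n)])

def rbf_fit_predict(train, targets, query, gamma=2.0, ridge=1e-6):
    # Gaussian RBF regression with one basis function per training state
    d2 = ((train[:, None, :] - train[None, :, :]) ** 2).sum(-1)
    w = np.linalg.solve(np.exp(-gamma * d2) + ridge * np.eye(len(train)), targets)
    d2q = ((query[:, None, :] - train[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2q) @ w

# synthetic "process behaviour" signal (e.g. alternating CPU and communication demand)
rng = np.random.default_rng(2)
t = np.arange(600)
series = np.sin(0.15 * t) + 0.5 * np.sin(0.037 * t) + 0.05 * rng.normal(size=600)

dim = 4
states = embed(series, dim)[:-1]    # embedded states that have a known next value
nxt = series[dim:]                  # value one step ahead of each state

split = 500
pred = rbf_fit_predict(states[:split], nxt[:split], states[split:])
print("one-step RMSE:", np.sqrt(np.mean((pred - nxt[split:]) ** 2)))

Once fitted, one-step prediction reduces to evaluating a weighted sum of Gaussian kernels over the stored states, which keeps the online cost of the predictor low.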
Abstract:
Recently the paper "Schwinger mechanism for gluon pair production in the presence of arbitrary time dependent chromo-electric field" by G. C. Nayak was published [Eur. Phys. J. C 59: 715, 2009; arXiv:0708.2430]. Its aim is to obtain an exact expression for the probability of non-perturbative gluon pair production per unit time, per unit volume, and per unit transverse momentum in an arbitrary time-dependent chromo-electric background field. We believe that the obtained expression is open to question. We demonstrate its inconsistency on some well-known examples. We think that this is a consequence of using the so-called "shift theorem" [arXiv:hep-th/0609192] in deriving the expression for the probability. We make some critical comments on the theorem and its applicability to the problem in question.
Complexity and anisotropy in host morphology make populations less susceptible to epidemic outbreaks
Abstract:
One of the challenges in epidemiology is to account for the complex morphological structure of hosts such as plant roots, crop fields, farms, cells, animal habitats and social networks, when the transmission of infection occurs between contiguous hosts. Morphological complexity brings an inherent heterogeneity in populations and affects the dynamics of pathogen spread in such systems. We have analysed the influence of realistically complex host morphology on the threshold for invasion and epidemic outbreak in an SIR (susceptible-infected-recovered) epidemiological model. We show that disorder expressed in the host morphology, together with anisotropy, reduces the probability of an epidemic outbreak and thus makes the system more resistant to invasion. We obtain general analytical estimates for minimally safe bounds on the invasion threshold and then illustrate their validity by considering an example of host data for branching hosts (salamander retinal ganglion cells). Several spatial arrangements of hosts with different degrees of heterogeneity have been considered in order to separately analyse the roles of shape complexity and anisotropy in the host population. The estimates for the invasion threshold are linked to morphological characteristics of the hosts that can be used for determining the threshold for invasion in practical applications.
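A toy illustration in Python of the qualitative claim above (a stochastic SIR on a square lattice, with random removal of hosts standing in for morphological disorder; this is not the continuum, morphology-based model analysed in the paper):

import numpy as np

def outbreak_fraction(L=40, occupancy=1.0, beta=0.7, runs=200, seed=3):
    # SIR on an L x L lattice: hosts occupy a site with probability `occupancy`,
    # infection passes to each susceptible nearest neighbour with probability `beta`,
    # and an infected host recovers after one time step.  Returns the fraction of
    # runs in which the final epidemic size exceeds 5% of the hosts.
    rng = np.random.default_rng(seed)
    big = 0
    for _ in range(runs):
        host = rng.random((L, L)) < occupancy
        state = np.zeros((L, L), dtype=np.int8)          # 0 = S, 1 = I, 2 = R
        occ = np.argwhere(host)
        i0, j0 = occ[rng.integers(len(occ))]             # seed one occupied site
        state[i0, j0] = 1
        while (state == 1).any():
            infected = state == 1
            pressure = np.zeros((L, L), dtype=int)
            pressure[1:, :] += infected[:-1, :]
            pressure[:-1, :] += infected[1:, :]
            pressure[:, 1:] += infected[:, :-1]
            pressure[:, :-1] += infected[:, 1:]
            p_inf = 1.0 - (1.0 - beta) ** pressure
            new = (state == 0) & host & (rng.random((L, L)) < p_inf)
            state[infected] = 2
            state[new] = 1
        if (state == 2).sum() > 0.05 * host.sum():
            big += 1
    return big / runs

# more disorder (lower occupancy) -> fewer large outbreaks at the same transmission rate
for occ in (1.0, 0.8, 0.6):
    print(f"occupancy {occ:.1f}: outbreak fraction {outbreak_fraction(occupancy=occ):.2f}")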
Abstract:
Traditional venom immunotherapy uses injections of whole bee venom in buffer or adsorbed in Al(OH)(3) in an expensive, time-consuming way. New strategies to improve the safety and efficacy of this treatment with a reduction of injections would, therefore, be of general interest. It would improve patient compliance and provide socio-economic benefits. Liposomes have a long tradition in drug delivery because they increase the therapeutic index and avoid drug degradation and secondary effects. However, bee venom melittin (Mel) and phospholipase (PLA(2)) destroy phospholipid membranes. Our central idea was to inhibit the PLA(2) and Mel activities through histidine alkylation and/or tryptophan oxidation (with pbb, para-bromophenacyl bromide, and/or NBS, N-bromosuccinimide, respectively) to make their encapsulation possible within stabilized liposomes. We strongly believe that this formulation will be nontoxic but immunogenic. In this paper, we present the characterization of the whole bee venom conformation during and after chemical modification and after interaction with liposomes, by ultraviolet, circular dichroism, and fluorescence spectroscopies. The PLA(2) and Mel activities were measured indirectly by changes in turbidity at 400 nm, rhodamine leak-out, and hemolysis. The native whole bee venom (BV) presented 78.06% alpha-helical content. Alkylation (A-BV) and succinylation (S-BV) of BV increased its alpha-helical content by 0.44% and 0.20%, respectively. The double-modified venom (S-A-BV) had a 0.74% increase in alpha-helical content. The chemical modification of BV induced a further change in protein conformation, observed through Trp residues that became buried with respect to the native whole BV. It was demonstrated that the liposomal membranes must contain pbb (SPC:Cho:pbb, 26:7:1) as a component to protect them from aggregation and/or fusion. The membranes containing pbb maintained the same turbidity (100%) after incubation with modified venom, in contrast with pbb-free membranes, which showed a 15% size decrease. This size decrease was interpreted as membrane degradation and was corroborated by a 50% rhodamine leak-out. Another fact that confirmed our interpretation was the observed 100% inhibition of the hemolytic activity after venom modification with pbb and NBS (S-A-BV). When S-A-BV interacted with liposomes, other protein conformational changes were observed, characterized by a 1.93% increase in S-A-BV alpha-helical content and the presence of tryptophan residues in a more hydrophobic environment. In other words, S-A-BV interacted with liposomal membranes, but this interaction was not effective in causing aggregation, leak-out, or fusion. A stable formulation composed of S-A-BV encapsulated within liposomes of SPC:Cho:pbb, at a ratio of 26:7:1, was devised. Large unilamellar vesicles of 202.5 nm with a negative surface charge (-24.29 mV) encapsulated 95% of the S-A-BV. This formulation can now be assayed in venom immunotherapy (VIT).