960 results for "Serial killers"
Abstract:
Animal models of bone marrow transplantation (BMT) allow evaluation of new experimental treatment strategies. One potential strategy involves the treatment of donor marrow with ultraviolet B light to allow transplantation across histocompatibility boundaries without an increase in graft rejection or graft-versus-host disease. A major requirement for a new experimental protocol, particularly if it involves manipulation of the donor marrow, is that the manipulated marrow gives rise to long-term multilineage engraftment. DNA-based methodologies are now routinely used by many centres to evaluate engraftment and degree of chimaerism post-BMT in humans. We report the adaptation of this methodology to the serial study of engraftment in rodents. Conditions have been defined which allow analysis of serial tail vein samples using PCR of short tandem repeat sequences (STR-PCR). These markers have been used to evaluate the contribution of ultraviolet B treated marrow to engraftment following BMT in rodents without compromising the health of the animals under study. Chimaerism data from sequential tail vein samples and bone marrow from selected sacrificed animals showed excellent correlation, thus confirming the validity of this approach in analysing haemopoietic tissue. The use of this assay may therefore facilitate experimental studies in animal BMT.
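The chimaerism readout behind an STR-PCR approach like the one above reduces to comparing the signal from donor-informative and recipient-informative alleles. A minimal sketch of that calculation, with invented peak areas and a hypothetical function name (the abstract does not specify the quantitation method):

```python
# Hypothetical sketch: percent donor chimaerism from STR-PCR peak areas.
# For an informative marker, donor and recipient alleles give distinct
# peaks; the donor fraction is estimated from their relative areas.

def percent_donor_chimaerism(donor_peak_area: float, recipient_peak_area: float) -> float:
    """Estimate % donor chimaerism as donor / (donor + recipient) * 100."""
    total = donor_peak_area + recipient_peak_area
    if total == 0:
        raise ValueError("no signal at this marker")
    return 100.0 * donor_peak_area / total

# Example: serial tail-vein samples from one animal (areas are illustrative).
samples = [(850.0, 150.0), (720.0, 280.0), (900.0, 100.0)]
chimaerism = [percent_donor_chimaerism(d, r) for d, r in samples]
```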
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the master's degree in Audiovisual and Multimedia.
Abstract:
The finding that serial recall performance for visually presented items is impaired by concurrently presented speech or sounds is referred to as the irrelevant sound effect (ISE). The foremost explanation for the effect is based on interference with rehearsal and seriation processes. The present series of experiments demonstrates that neither rehearsal nor seriation is necessary to observe the ISE. Evidence comes from three experiments that a) allow participants to report to-be-remembered items in any order, b) eliminate rehearsal by engaging participants in a cover task and surprising them with a memory test, and c) show that surprise non-serial recognition is immune to rehearsal-based experimental manipulations that modulate the ISE in more typical serial recall tasks. Together, the results show that models that rely on rehearsal or seriation processes to account for the ISE need to be reconsidered. Results are discussed in terms of interference with encoding of to-be-remembered material.
Abstract:
Several authors have recently discussed the limited dependent variable regression model with serial correlation between residuals. The pseudo-maximum likelihood estimators obtained by ignoring serial correlation altogether have been shown to be consistent. We present alternative pseudo-maximum likelihood estimators which are obtained by ignoring serial correlation only selectively. Monte Carlo experiments on a model with first-order serial correlation suggest that our alternative estimators have substantially lower mean-squared errors in medium-size and small samples, especially when the serial correlation coefficient is high. The same experiments also suggest that the true level of the confidence intervals established with our estimators by assuming asymptotic normality is somewhat lower than the intended level. Although the paper focuses on models with only first-order serial correlation, the generalization of the proposed approach to serial correlation of higher order is also discussed briefly.
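The data-generating process behind Monte Carlo experiments of this kind can be illustrated with a short simulation: a censored (limited dependent variable) regression whose errors follow a first-order autoregression. This is a generic sketch of that setup, not the paper's actual experimental design; all parameter values and names are ours:

```python
# Illustrative sketch of a limited dependent variable model with first-order
# serial correlation in the residuals: y_t* = beta * x_t + u_t, with
# u_t = rho * u_{t-1} + e_t, and only max(y_t*, 0) observed (Tobit-style).

import random

def simulate_censored_ar1(n, beta=1.0, rho=0.8, sigma=1.0, seed=0):
    """Simulate n observations (x_t, observed y_t) under AR(1) errors."""
    rng = random.Random(seed)
    u, data = 0.0, []
    for t in range(n):
        x = rng.gauss(0.0, 1.0)
        u = rho * u + rng.gauss(0.0, sigma)  # first-order serial correlation
        y_star = beta * x + u
        data.append((x, max(y_star, 0.0)))   # censoring at zero
    return data

sample = simulate_censored_ar1(200)
```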
Abstract:
A group of agents participate in a cooperative enterprise producing a single good. Each participant contributes a particular type of input; output is nondecreasing in these contributions. How should it be shared? We analyze the implications of the axiom of Group Monotonicity: if a group of agents simultaneously decrease their input contributions, not all of them should receive a higher share of output. We show that in combination with other more familiar axioms, this condition pins down a very small class of methods, which we dub nearly serial.
Abstract:
We consider the problem of testing whether the observations X1, ..., Xn of a time series are independent with unspecified (possibly nonidentical) distributions symmetric about a common known median. Various bounds on the distributions of serial correlation coefficients are proposed: exponential bounds, Eaton-type bounds, Chebyshev bounds and Berry-Esséen-Zolotarev bounds. The bounds are exact in finite samples, distribution-free and easy to compute. The performance of the bounds is evaluated and compared with traditional serial dependence tests in a simulation experiment. The procedures proposed are applied to U.S. data on interest rates (commercial paper rate).
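The statistic whose finite-sample distribution such bounds control is the serial correlation coefficient of the series centred at its known median. A minimal sketch of the lag-1 version (the exact normalization used in the paper may differ; this is the textbook form):

```python
# Sketch (not the paper's bounds themselves): the lag-1 serial correlation
# coefficient for a series centred at a known median m.

def lag1_serial_correlation(x, median=0.0):
    """r_1 = sum_t (x_t - m)(x_{t+1} - m) / sum_t (x_t - m)^2."""
    y = [v - median for v in x]
    num = sum(a * b for a, b in zip(y, y[1:]))
    den = sum(v * v for v in y)
    return num / den

# A trending toy series yields a positive r_1:
r1 = lag1_serial_correlation([1.0, 2.0, 3.0, 4.0])
```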
Abstract:
We offer an axiomatization of the serial cost-sharing method of Friedman and Moulin (1999). The key property in our axiom system is Group Demand Monotonicity, asking that when a group of agents raise their demands, not all of them should pay less.
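For intuition about what is being axiomatized, here is a sketch of the classic Moulin–Shenker serial rule for a single homogeneous good (Friedman and Moulin (1999) extend this idea; the function names and cost function below are ours):

```python
# Sketch of the classic serial cost-sharing rule: order agents by demand;
# each incremental block of cost, up to the level at which the k-th smallest
# demander is fully served, is split equally among agents still being served.

def serial_cost_shares(demands, cost):
    """Return each agent's cost share under the serial rule."""
    n = len(demands)
    order = sorted(range(n), key=lambda i: demands[i])
    q = [demands[i] for i in order]          # q[0] <= ... <= q[n-1]
    shares = [0.0] * n
    s_prev, x_prev = 0.0, 0.0
    for k in range(n):                       # k-th smallest demander
        s_k = sum(q[:k]) + (n - k) * q[k]    # intermediate service level
        x_k = x_prev + (cost(s_k) - cost(s_prev)) / (n - k)
        shares[order[k]] = x_k
        s_prev, x_prev = s_k, x_k
    return shares

# With linear cost, serial shares reduce to proportional shares:
shares = serial_cost_shares([1.0, 2.0, 3.0], cost=lambda t: 2.0 * t)
```

Note that the shares always sum to the total cost of serving all demands, i.e. the rule is budget-balanced by construction.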
Abstract:
Electron wave motion in a quantum wire with periodic structure is treated by direct solution of the Schrödinger equation as a mode-matching problem. Our method is particularly useful for a wire consisting of several distinct units, where the total transfer matrix for wave propagation is just the product of those for its basic units. It is generally applicable to any linearly connected serial device, and it can be implemented on a small computer. The one-dimensional mesoscopic crystal recently considered by Ulloa, Castaño, and Kirczenow [Phys. Rev. B 41, 12 350 (1990)] is discussed with our method, and is shown to be a strictly one-dimensional problem. Electron motion in the multiple-stub T-shaped potential well considered by Sols et al. [J. Appl. Phys. 66, 3892 (1989)] is also treated. A structure combining features of both of these is investigated.
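The composition rule the abstract relies on can be sketched in a few lines: for serially connected units, the total transfer matrix is the ordered product of the units' matrices. The matrices below are illustrative numbers only, not solutions of the Schrödinger equation:

```python
# Sketch of transfer-matrix composition for a linearly connected serial
# device: total M = M_N ... M_2 M_1, with unit 1 traversed first.

def matmul2(a, b):
    """Multiply two 2x2 matrices stored as [[.,.],[.,.]]."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def total_transfer_matrix(units):
    """Ordered product for units listed in the order the wave meets them."""
    total = [[1, 0], [0, 1]]                 # identity: empty device
    for m in units:
        total = matmul2(m, total)            # left-multiply each new unit
    return total

# A periodic wire of n identical cells is the unit-cell matrix to the n-th power:
cell = [[1.2, 0.3], [0.3, 1.0]]
wire = total_transfer_matrix([cell] * 3)
```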
Abstract:
Abstract based on the author's. Abstract in Spanish and English.
Abstract:
The clustering in time (seriality) of extratropical cyclones is responsible for large cumulative insured losses in western Europe, though surprisingly little scientific attention has been given to this important property. This study investigates and quantifies the seriality of extratropical cyclones in the Northern Hemisphere using a point-process approach. A possible mechanism for serial clustering is the time-varying effect of the large-scale flow on individual cyclone tracks. Another mechanism is the generation by one parent cyclone of one or more offspring through secondary cyclogenesis. A long cyclone-track database was constructed for extended October–March winters from 1950 to 2003 using 6-h analyses of 850-mb relative vorticity derived from the NCEP–NCAR reanalysis. A dispersion statistic based on the variance-to-mean ratio of monthly cyclone counts was used as a measure of clustering. It reveals extensive regions of statistically significant clustering in the European exit region of the North Atlantic storm track and over the central North Pacific. Monthly cyclone counts were regressed on time-varying teleconnection indices with a log-linear Poisson model. Five independent teleconnection patterns were found to be significant factors over Europe: the North Atlantic Oscillation (NAO), the east Atlantic pattern, the Scandinavian pattern, the east Atlantic/western Russian pattern, and the polar Eurasian pattern. The NAO alone is not sufficient for explaining the variability of cyclone counts in the North Atlantic region and western Europe. Rate dependence on time-varying teleconnection indices accounts for the variability in monthly cyclone counts, and a cluster process did not need to be invoked.
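The dispersion diagnostic described above is simple to compute: the variance-to-mean ratio (index of dispersion) of monthly counts is 1 for a homogeneous Poisson process, and values above 1 indicate clustering. A minimal sketch, with invented counts:

```python
# Sketch of the variance-to-mean dispersion statistic for event counts.
# Ratio ~ 1: consistent with a homogeneous Poisson process;
# ratio > 1: overdispersed counts, i.e. serial clustering.

def dispersion_ratio(counts):
    """Variance-to-mean ratio (index of dispersion) of event counts."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

# An overdispersed (clustered) toy series of monthly counts:
clustered = [0, 9, 1, 8, 0, 10, 1, 7]
ratio = dispersion_ratio(clustered)
```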
Abstract:
The authors propose a bit-serial pipeline that performs the genetic operators in a hardware genetic algorithm. The bit-serial nature of the dataflow allows the operators to be pipelined, resulting in an architecture which is area-efficient, easily scaled, and independent of chromosome length. An FPGA implementation of the device achieves a throughput of >25 million genes per second.
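The key observation enabling the pipeline is that bitwise genetic operators act independently on each bit position, so parent bits can be streamed through the operator stages one bit per clock, regardless of chromosome length. A software sketch of that dataflow (all operator choices, probabilities, and names here are ours, not the paper's hardware design):

```python
# Software analogue of a bit-serial genetic-operator pipeline: each clock
# cycle one bit pair enters, passes a uniform-crossover stage, then a
# mutation stage, and one child bit emerges.

import random

def bit_serial_pipeline(parent_a, parent_b, p_cross=0.5, p_mut=0.01, seed=0):
    """Stream two parents bit by bit through crossover then mutation."""
    rng = random.Random(seed)
    child = []
    for a, b in zip(parent_a, parent_b):     # one bit pair per 'clock cycle'
        bit = a if rng.random() < p_cross else b   # uniform crossover stage
        if rng.random() < p_mut:                   # mutation stage
            bit ^= 1
        child.append(bit)
    return child

child = bit_serial_pipeline([0, 1, 1, 0, 1], [1, 1, 0, 0, 0])
```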
Abstract:
The paper concerns the design and analysis of serial dilution assays to estimate the infectivity of a sample of tissue when it is assumed that the sample contains a finite number of indivisible infectious units such that a subsample will be infectious if it contains one or more of these units. The aim of the study is to estimate the number of infectious units in the original sample. The standard approach to the analysis of data from such a study is based on the assumption of independence of aliquots both at the same dilution level and at different dilution levels, so that the numbers of infectious units in the aliquots follow independent Poisson distributions. An alternative approach is based on calculation of the expected value of the total number of samples tested that are not infectious. We derive the likelihood for the data on the basis of the discrete number of infectious units, enabling calculation of the maximum likelihood estimate and likelihood-based confidence intervals. We use the exact probabilities that are obtained to compare the maximum likelihood estimate with those given by the other methods in terms of bias and standard error and to compare the coverage of the confidence intervals. We show that the methods have very similar properties and conclude that for practical use the method that is based on the Poisson assumption is to be recommended, since it can be implemented by using standard statistical software. Finally we consider the design of serial dilution assays, concluding that it is important that neither the dilution factor nor the number of samples that remain untested should be too large.
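The Poisson-assumption analysis the authors recommend can be sketched directly: if an aliquot carries a fraction f of the sample, the number of infectious units it contains is Poisson with mean λf, so the aliquot is negative with probability exp(-λf), and λ is estimated by maximum likelihood from the counts of negative and positive aliquots at each dilution level. The data layout, function names, and root-finding method below are ours, not the paper's:

```python
# Sketch of the Poisson-assumption MLE for a serial dilution assay.
# levels: list of (fraction_of_sample_per_aliquot, n_negative, n_positive).
# The log-likelihood is sum_k [neg_k * (-lam*f_k) + pos_k * log(1 - exp(-lam*f_k))];
# its derivative is strictly decreasing in lam, so bisection finds the MLE.

import math

def dilution_mle(levels, lo=1e-6, hi=1e6, tol=1e-9):
    """Return the MLE of the number of infectious units in the whole sample."""
    def score(lam):  # derivative of the log-likelihood in lam
        s = 0.0
        for f, neg, pos in levels:
            p_neg = math.exp(-lam * f)
            s += -neg * f + pos * f * p_neg / (1.0 - p_neg)
        return s
    while hi - lo > tol * (1.0 + lo):        # bisection on the score
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Single level, half the aliquots negative, each holding 10% of the sample,
# so the MLE solves exp(-0.1 * lam) = 0.5:
lam_hat = dilution_mle([(0.1, 5, 5)])
```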