840 results for Directed Search
Abstract:
Classical treatments of sequential mate choice assume that the distribution of the quality of potential mates is known a priori. This assumption, made for analytical tractability, may seem unrealistic, as it conflicts with empirical data as well as with evolutionary arguments. Using stochastic dynamic programming, we develop a model that allows searching individuals to learn about the distribution, in particular by updating its mean and variance during the search. In a constant environment, a priori knowledge of the parameter values brings strong benefits in both the time needed to make a decision and the average value of the mate obtained. Knowing the variance yields more benefits than knowing the mean, and the benefits increase with variance. However, the costs of learning become progressively lower as more time is available for choice. When parameter values differ between demes and/or searching periods, a strategy relying on fixed a priori information may lead to erroneous decisions, which confers an advantage on the learning strategy. However, time for choice plays an important role as well: if a decision must be made rapidly, a fixed strategy may do better even when the fixed image does not coincide with the local parameter values. These results help delineate the ecological-behavioral context in which learning strategies may spread.
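As a hedged illustration of the kind of strategy modeled here (a toy rule of thumb of our own, not the authors' stochastic dynamic program), the sketch below shows a searcher who updates running estimates of the mean and variance of mate quality and relaxes its acceptance threshold as the deadline approaches.

    import random
    import statistics

    def search_with_learning(qualities, horizon, z=1.0):
        """Accept the first candidate that clears mean + z*sd of the qualities
        seen so far, with the threshold relaxing as time runs out."""
        seen = []
        for t, q in enumerate(qualities[:horizon]):
            seen.append(q)
            if len(seen) < 2:
                continue  # need two observations to estimate a variance
            mu = statistics.fmean(seen)
            sd = statistics.stdev(seen)
            if q >= mu + z * sd * (1 - t / horizon):
                return t, q  # decision time and mate value obtained
        return horizon - 1, qualities[horizon - 1]  # forced choice at the deadline

    rng = random.Random(0)
    candidates = [rng.gauss(10.0, 3.0) for _ in range(50)]
    print(search_with_learning(candidates, horizon=20))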
Abstract:
The use of tumor necrosis factor alpha (TNFalpha) in cancer therapy is limited by its short circulatory half-life and its severe systemic side effects. To overcome these limitations, we evaluated the capability of a bispecific antibody (BAb) directed against carcinoembryonic antigen (CEA) and human TNFalpha to target this cytokine in tumors. A BAb was constructed by coupling the Fab' fragments of an anti-CEA monoclonal antibody (MAb) to the Fab' fragments of an anti-TNFalpha MAb via a stable thioether linkage. The double specificity of the BAb for CEA and TNFalpha was demonstrated using a BIAcore™ two-step analysis. The affinity constants of the BAb for CEA immobilized on a sensor chip and for soluble TNFalpha added to the CEA-BAb complex were as high as those of the parental MAbs (1.7 x 10^9 M^-1 and 6.6 x 10^8 M^-1, respectively). The 125I-labeled BAb retained high immunoreactivity with both CEA and TNFalpha immobilized on a solid phase. In nude mice xenografted with the human colorectal carcinoma T380, the 125I-labeled BAb showed tumor localization and biodistribution comparable to those of the 131I-labeled anti-CEA parental F(ab')2, with 25-30% of the injected dose (ID)/g tumor at 24 h and 20% ID/g tumor at 48 h. To target TNFalpha to the tumor, a two-step i.v. injection protocol was used, in which a variable dose of 125I-labeled BAb was injected first, followed 24 or 48 h later by a constant dose of 131I-labeled TNFalpha (1 microg). Mice pretreated with 3 microg of BAb and sacrificed 2, 4, 6, or 8 h after the injection of TNFalpha showed a 1.5- to 2-fold increased concentration of 131I-labeled TNFalpha in the tumor as compared with control mice, which received TNFalpha alone. With a higher dose of BAb (25 microg), mice showed better targeting of TNFalpha, with a 3.2-fold increased concentration of 131I-labeled TNFalpha in the tumor: 9.3% versus 2.9% ID/g in control mice 6 h after TNFalpha injection. In a one-step injection protocol using a premixed BAb-TNFalpha preparation, similar results were obtained 6 h postinjection (3.5-fold increased TNFalpha tumor concentration). A longer retention time of TNFalpha was observed, leading to an 8.1-fold increased concentration of TNFalpha in the tumor 14 h postinjection (4.4% versus 0.5% ID/g tumor for BAb-treated and control mice, respectively). These results show that our BAb is able, first, to localize in a human colon carcinoma and, there, to immunoabsorb the i.v.-injected TNFalpha, leading to its increased concentration at the tumor site.
Abstract:
We have reported the identification of human gene MAGE-1, which directs the expression of an antigen recognized on a melanoma by autologous cytolytic T lymphocytes (CTL). We show here that CTL directed against this antigen, which was named MZ2-E, recognize a nonapeptide encoded by the third exon of gene MAGE-1. The CTL also recognize this peptide when it is presented by mouse cells transfected with an HLA-A1 gene, confirming the association of antigen MZ2-E with the HLA-A1 molecule. Other members of the MAGE gene family do not code for the same peptide, suggesting that only MAGE-1 produces the antigen recognized by the anti-MZ2-E CTL. Our results open the possibility of immunizing HLA-A1 patients whose tumor expresses MAGE-1 either with the antigenic peptide or with autologous antigen-presenting cells pulsed with the peptide.
Abstract:
BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitate accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system, specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: Actin beta (Actb), Glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had previously been demonstrated to be stable in RT-qPCR arrays. All the genes tested in this study using the geNorm algorithm presented gene stability values (M-values) acceptable enough to qualify as potential reference genes in both the DRG and the spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off, with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide an accurate normalization.
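A minimal sketch of the geNorm stability measure referred to above, under our simplified reading and with made-up toy data: a gene's M-value is its average pairwise variation, i.e. the standard deviation across samples of the log2 expression ratio with each other candidate gene, and a lower M means a more stable gene.

    import math
    import statistics

    def m_value(gene, expr):
        """Average over the other candidates of stdev(log2 ratio) across samples."""
        variations = []
        for other in expr:
            if other == gene:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[gene], expr[other])]
            variations.append(statistics.stdev(ratios))
        return statistics.fmean(variations)

    # Toy expression table: gene -> one value per sample (same sample order).
    expr = {
        "Actb":  [1.00, 1.10, 0.95],
        "GAPDH": [2.00, 2.15, 1.90],
        "18S":   [5.00, 9.00, 2.00],
    }
    for gene in sorted(expr, key=lambda g: m_value(g, expr)):
        print(gene, round(m_value(gene, expr), 3))  # most stable first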
Abstract:
This paper presents an Optimised Search Heuristic that combines a tabu search method with the verification of violated valid inequalities. The solution delivered by the tabu search is partially destroyed by a randomised greedy procedure, and then the valid inequalities are used to guide the reconstruction of a complete solution. An application of the new method to the Job-Shop Scheduling problem is presented.
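The control flow described above might be sketched as follows (entirely illustrative: a tiny cardinality-constrained selection problem stands in for Job-Shop Scheduling, a single hand-made inequality stands in for the paper's valid inequalities, and all names are ours).

    import random

    rng = random.Random(1)
    N, K = 12, 6
    value = [rng.uniform(0.0, 1.0) for _ in range(N)]

    def cost(x):
        return -sum(v for v, xi in zip(value, x) if xi)  # maximize picked value

    def violated_cuts(x):
        return ["sum(x) <= K"] if sum(x) > K else []  # toy valid inequality

    def tabu_search(x, steps=100):
        best, tabu = x[:], []
        for _ in range(steps):
            i = rng.randrange(N)
            if i in tabu:
                continue
            y = x[:]
            y[i] = 1 - y[i]  # flip one item in or out
            if not violated_cuts(y) and cost(y) <= cost(x):
                x, tabu = y, (tabu + [i])[-3:]  # short tabu tenure
            if cost(x) < cost(best):
                best = x[:]
        return best

    def destroy_and_rebuild(x):
        y = [xi if rng.random() > 0.3 else 0 for xi in x]  # randomized destruction
        for i in sorted(range(N), key=lambda i: -value[i]):  # greedy reconstruction
            trial = y[:]
            trial[i] = 1
            if not violated_cuts(trial):  # cuts guide which rebuilds are allowed
                y = trial
        return y

    x = tabu_search([0] * N)
    for _ in range(20):
        x = tabu_search(destroy_and_rebuild(x))
    print(round(-cost(x), 3), x)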
Abstract:
Iterated Local Search has many of the desirable features of a metaheuristic: it is simple, easy to implement, robust, and highly effective. The essential idea of Iterated Local Search lies in focusing the search not on the full space of solutions but on a smaller subspace defined by the solutions that are locally optimal for a given optimization engine. The success of Iterated Local Search lies in the biased sampling of this set of local optima. How effective this approach turns out to be depends mainly on the choice of the local search, the perturbations, and the acceptance criterion. So far, in spite of its conceptual simplicity, it has led to a number of state-of-the-art results without the use of much problem-specific knowledge; with further work, so that the different modules are well adapted to the problem at hand, Iterated Local Search can often become a competitive or even state-of-the-art algorithm. The purpose of this review is both to give a detailed description of this metaheuristic and to show where it stands in terms of performance.
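As a self-contained sketch of the loop just described (local search, perturbation, acceptance criterion), the toy below runs Iterated Local Search on a deliberately simple bit-string problem; on a problem this easy the local search alone would suffice, so treat it purely as a template for the control flow.

    import random

    rng = random.Random(7)
    target = [rng.randrange(2) for _ in range(40)]

    def cost(x):
        return sum(a != b for a, b in zip(x, target))  # Hamming distance to target

    def local_search(x):
        improved = True
        while improved:  # first-improvement bit-flip descent
            improved = False
            for i in range(len(x)):
                y = x[:]
                y[i] = 1 - y[i]
                if cost(y) < cost(x):
                    x, improved = y, True
        return x

    def perturb(x, k=4):
        y = x[:]
        for i in rng.sample(range(len(x)), k):  # kick: flip k random bits
            y[i] = 1 - y[i]
        return y

    def iterated_local_search(iterations=30):
        x = local_search([rng.randrange(2) for _ in range(40)])
        for _ in range(iterations):
            y = local_search(perturb(x))
            if cost(y) <= cost(x):  # acceptance criterion: keep ties and improvements
                x = y
        return x

    print(cost(iterated_local_search()))  # 0 means the hidden target was recovered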
Abstract:
To study the short-run and long-run implications for wage inequality, we introduce directed technical change into a Ricardian model of offshoring. A unique final good is produced by combining a skilled and an unskilled product, each produced from a continuum of intermediates (tasks). Some of these tasks can be transferred from a skill-abundant West to a skill-scarce East. Profit maximization determines both the extent of offshoring and technological progress. Offshoring induces skill-biased technical change because it increases the relative price of skill-intensive products, and induces technical change favoring unskilled workers because it expands the market size for technologies complementing unskilled labor. In the empirically more relevant case, starting from low levels, an increase in offshoring opportunities triggers a transition with falling real wages for unskilled workers in the West, skill-biased technical change and rising skill premia worldwide. However, when the extent of offshoring becomes sufficiently large, further increases in offshoring induce technical change now biased in favor of unskilled labor, because offshoring closes the gap between unskilled wages in the West and the East, thus limiting the power of the price effect fueling skill-biased technical change. The unequalizing impact of offshoring is thus greatest at the beginning. Transitional dynamics reveal that offshoring and technical change are substitutes in the short run but complements in the long run. Finally, though offshoring improves the welfare of workers in the East, it may benefit or harm unskilled workers in the West depending on elasticities and the equilibrium growth rate.
Abstract:
Signal search analysis is a general method to discover and characterize sequence motifs that are positionally correlated with a functional site (e.g. a transcription or translation start site). The method has played an instrumental role in the analysis of eukaryotic promoter elements. The signal search analysis server provides access to four different computer programs as well as to a large number of precompiled functional site collections. The programs offered allow: (i) the identification of non-random sequence regions under evolutionary constraint; (ii) the detection of consensus sequence-based motifs that are over- or under-represented at a particular distance from a functional site; (iii) the analysis of the positional distribution of a consensus sequence- or weight matrix-based sequence motif around a functional site; and (iv) the optimization of a weight matrix description of a locally over-represented sequence motif. These programs can be accessed at: http://www.isrec.isb-sib.ch/ssa/.
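Program (iii) above can be pictured with a small sketch of our own (illustrative only, not the server's code): given sequences aligned on a functional site, tabulate where a consensus motif occurs relative to that site.

    from collections import Counter

    def motif_positions(seqs, motif, site):
        """Count motif occurrences by position relative to the aligned site."""
        counts = Counter()
        for s in seqs:
            for i in range(len(s) - len(motif) + 1):
                if s[i:i + len(motif)] == motif:
                    counts[i - site] += 1
        return counts

    # Toy sequences aligned so that the functional site sits at index 5.
    seqs = ["ACGTATAAGC", "GGTATAAACG", "TTTATAAGGA"]
    print(motif_positions(seqs, "TATAA", site=5))  # Counter({-3: 2, -2: 1})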
Abstract:
In this paper we present an algorithm to assign proctors to exams. This NP-hard problem is related to the generalized assignment problem with multiple objectives. The problem consists of assigning teaching assistants to proctor final exams at a university. We formulate this problem as a multiobjective integer program (IP) with a preference function and a workload-fairness function. We then also consider a weighted objective that combines both functions. We develop a scatter search procedure and compare its outcome with solutions found by solving the IP model with CPLEX 6.5. Our test problems are real instances from a university in Spain.
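In the spirit of the model described above (a hedged sketch in our own notation, not the authors' formulation), let x_{ij} = 1 when teaching assistant i proctors exam j; a weighted objective combining preferences and workload fairness might read:

    \begin{aligned}
    \max_{x} \quad & \lambda \sum_{i,j} p_{ij} x_{ij} \;-\; (1 - \lambda) \sum_{i} \Bigl|\, \sum_{j} h_j x_{ij} - \bar h \,\Bigr| \\
    \text{s.t.} \quad & \sum_{i} x_{ij} = r_j \quad \text{for every exam } j, \qquad x_{ij} \in \{0, 1\},
    \end{aligned}

where p_{ij} is assistant i's preference for exam j, h_j the exam's duration, r_j the number of proctors it requires, \bar h the average workload, and \lambda the weight trading preference against fairness. The absolute values can be linearized with auxiliary variables in the usual way before handing the model to a solver such as CPLEX.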
Abstract:
This paper analyses and discusses arguments that have emerged from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) that individual is found as a result of a database search and (ii) the remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper clarifies that there is no need either to introduce a correction factor equal to the size of the searched database (i.e., to reduce the likelihood ratio), or to adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic, which has repeatedly demonstrated that neither of these two requirements should be a cause of concern.
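A toy Bayesian computation (our illustration, not the paper's Bayesian networks) makes the point concrete: with a uniform prior over a population of size P_SIZE, a random-match probability GAMMA, and a database of size N_DB whose other members are all excluded, the posterior probability that the matching suspect is the source requires no division by the database size; the exclusions in fact strengthen the case slightly.

    P_SIZE, N_DB, GAMMA = 1000, 100, 0.01  # hypothetical round numbers

    # Event E: the suspect matches the stain and the other N_DB - 1 database
    # members are excluded. Likelihood of E, by who the true source is:
    #   the suspect:                  1 * (1 - GAMMA) ** (N_DB - 1)
    #   another database member:      0  (that member was excluded)
    #   someone outside the database: GAMMA * (1 - GAMMA) ** (N_DB - 1)
    p_suspect = (1 / P_SIZE) * (1 - GAMMA) ** (N_DB - 1)
    p_outside = ((P_SIZE - N_DB) / P_SIZE) * GAMMA * (1 - GAMMA) ** (N_DB - 1)
    posterior_after_search = p_suspect / (p_suspect + p_outside)

    # Same match found without any database search ("probable cause" case):
    p_suspect_pc = 1 / P_SIZE
    p_other_pc = ((P_SIZE - 1) / P_SIZE) * GAMMA
    posterior_probable_cause = p_suspect_pc / (p_suspect_pc + p_other_pc)

    print(round(posterior_after_search, 4))    # 0.1 = 1 / (1 + (P_SIZE - N_DB) * GAMMA)
    print(round(posterior_probable_cause, 4))  # ~0.091: the search did not weaken the case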
Abstract:
Firms compete by choosing both a price and a design from a family of designs that can be represented as demand rotations. Consumers engage in costly sequential search among firms. Each time a consumer pays a search cost he observes a new offering. An offering consists of a price quote and a new good, where goods might vary in the extent to which they are good matches for the consumer. In equilibrium, only two design-styles arise: either the most niche, where consumers are likely to either love or loathe the product, or the broadest, where consumers are likely to have similar valuations. In equilibrium, different firms may simultaneously offer both design-styles. We perform comparative statics on the equilibrium and show that a fall in search costs can lead to higher industry prices and profits and lower consumer surplus. Our analysis is related to discussions of how the internet has led to the prevalence of niche goods and the "long tail" phenomenon.