942 results for sequential exploitation
Abstract:
The morphology and mechanical properties of polypropylene/high-density polyethylene (PP/HDPE) blends over a wide range of compositions, modified by a sequential Ziegler-Natta polymerization product (PP-PE), have been investigated. PP-PE contains multiple components, including PP, ethylene-propylene copolymer (EPC), and high-molecular-weight polyethylene (HMWPE). The effects of PP-PE on the mechanical properties and morphology of the PP/HDPE blends are the aggregate results of all its individual components. Addition of PP-PE not only improved the tensile strength of the blends but also increased the elongation at break linearly, while the moduli were nearly unchanged. Morphological studies show that the adhesion between the two phases is enhanced in blends of all compositions and that the dispersed domain sizes decrease monotonically with increasing PP-PE content. PP-PE has been demonstrated to be a more effective compatibilizer than EPC. Based on these results, it can be concluded that the tensile strength of the blends depends primarily on the adhesion between the two phases, and the elongation at break depends primarily on the domain size of the dispersed component. (C) 1995 John Wiley & Sons, Inc.
Abstract:
A group of natural diosgenyl saponins was synthesized in a highly efficient manner employing the 'one-pot sequential glycosylation' protocol with the combined use of glycosyl trichloroacetimidates and thioglycosides. (C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
Ebolaviruses (EBOVs) are among the most virulent and deadly pathogens known, causing fulminant haemorrhagic fevers in humans and non-human primates. The 2014 outbreak of Ebola virus disease (EVD) in West Africa claimed more lives than all previous EVD outbreaks combined. The high EBOV mortality rates have been related to virus-induced impairment of the host innate immune response by two virus-encoded proteins, VP24 and VP35. EBOV VP35 is a multifunctional protein: it is essential for viral replication as a component of the viral RNA polymerase, and it also participates in nucleocapsid assembly. Early in EBOV infection, alpha/beta interferon (IFN-α/β) production would normally be triggered upon recognition of viral dsRNA products by cytoplasmic retinoic acid-inducible gene I (RIG-I)-like receptors (RLRs). However, this recognition is efficiently prevented by the double-stranded RNA (dsRNA) binding activity of the EBOV VP35 protein, which masks the RLR binding sites on the dsRNA phosphate backbone, as well as the 5'-triphosphate (5'-ppp) dsRNA ends, from RIG-I recognition. In addition to dsRNA binding and sequestration, EBOV VP35 inhibits IFN-α/β production by preventing the activation of IFN regulatory factor 3 (IRF-3) through direct interaction with cellular proteins. Previous studies demonstrated that single amino acid changes in the VP35 dsRNA binding domain reduce EBOV virulence, indicating that VP35 is an attractive target for antiviral drug development. Within this context, we report here the establishment of a novel method to characterize the inhibition by EBOV VP35 of the dsRNA-dependent RIG-I-mediated IFN-β signaling pathway in a BSL-2 cell culture setting. In this system, a plasmid containing the promoter region of the IFN-β gene linked to a luciferase reporter gene was transfected, together with an EBOV VP35 mammalian expression plasmid, into the IFN-sensitive A549 cell line, and IFN induction was stimulated by dsRNA transfection.
Through alanine-scanning mutational studies combining biochemical, cellular, and computational methods, we highlighted the importance of several VP35 residues involved in dsRNA end-capping binding, such as R312, K282, and R322, which may serve as targets for the development of small-molecule inhibitors against EBOV. Furthermore, we identified a synthetic compound that increased IFN induction only under antiviral response stimulation and subverted VP35 inhibition, making it very attractive for antiviral drug development. In conclusion, our results establish a new assay as a straightforward tool for the screening of antiviral compounds that target i) dsRNA-VP35 or cellular protein-VP35 interactions and ii) the dsRNA-dependent RIG-I-mediated IFN signaling pathway, in order to potentiate the IFN response against VP35 inhibition, setting the basis for further drug development.
Abstract:
English & Polish jokes based on linguistic ambiguity are contrasted. Linguistic ambiguity results from a multiplicity of semantic interpretations motivated by structural pattern. The meanings can be "translated" either by variations of the corresponding minimal strings or by specifying the type & extent of modification needed between the two interpretations. C. F. Hockett's (1972) translatability notion, that a joke is linguistic if it cannot readily be translated into other languages without losing its humor, is used to interpret some cross-linguistic jokes. It is claimed that additional intralinguistic criteria are needed to classify jokes. By using a syntactic representation, the humor can be explained & compared cross-linguistically. Since the mapping of semantic values onto lexical units is highly language specific, translatability is much less frequent with lexical ambiguity. Similarly, phonological jokes are not usually translatable. Pragmatic ambiguity can be translated on the basis of H. P. Grice's (1975) cooperative principle of conversation, which calls for discourse interpretations. If the distinction between linguistic & nonlinguistic jokes is based on translatability, pragmatic jokes must be excluded from the classification. Because of their universality, pragmatic jokes should be included in the linguistic classification by going beyond the translatability criteria & using intralinguistic features to describe them.
Abstract:
An exceptional concentration of almost identical depressions exists near the small towns of Krotoszyn, Koźmin and Raszków (southern Wielkopolska). Their origin, however, differs from that of the typical postglacial relief: they are man-made, enlarged thermal-contraction structures that developed at the very end of the Middle Polish (Warthian) glaciation and during the North Polish (Weichselian) glaciation, most probably under periglacial conditions.
Abstract:
To serve asynchronous requests using multicast, two categories of techniques, stream merging and periodic broadcasting, have been proposed. For sequential streaming access, where requests are uninterrupted from the beginning to the end of an object, these techniques are highly scalable: the required server bandwidth for stream merging grows logarithmically with the request arrival rate, and the required server bandwidth for periodic broadcasting varies logarithmically with the inverse of the start-up delay. However, sequential access is inadequate for modeling the partial requests and client interactivity observed in various streaming access workloads. This paper analytically and experimentally studies the scalability of multicast delivery under a non-sequential access model where requests start at random points in the object. We show that the required server bandwidth for any protocol providing immediate service grows at least as the square root of the request arrival rate, and the required server bandwidth for any protocol providing delayed service grows linearly with the inverse of the start-up delay. We also investigate the impact of limited client receiving bandwidth on scalability. We optimize practical protocols that provide immediate service to non-sequential requests. These protocols utilize limited client receiving bandwidth, and they are near-optimal in that the required server bandwidth is very close to its lower bound.
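The contrast between the two scaling regimes can be illustrated numerically. The sketch below uses normalized bandwidth units and illustrative functions (a plain logarithm and square root), not the paper's exact expressions:

```python
import math

def sequential_bandwidth(arrival_rate):
    # Stream merging under sequential access: required server bandwidth
    # grows logarithmically with the request arrival rate (normalized units).
    return math.log(1 + arrival_rate)

def nonsequential_lower_bound(arrival_rate):
    # Lower bound for immediate service under random start points:
    # grows as the square root of the arrival rate (normalized units).
    return math.sqrt(arrival_rate)

for lam in (10, 100, 1000, 10000):
    seq = sequential_bandwidth(lam)
    nonseq = nonsequential_lower_bound(lam)
    print(f"rate={lam:6d}  sequential~{seq:6.2f}  non-sequential>={nonseq:7.2f}")
```

Even at moderate arrival rates the square-root lower bound dwarfs the logarithmic cost of sequential stream merging, which is the scalability gap the paper quantifies.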
Abstract:
A model that extends the adaptive resonance theory (ART) model to sequential memory is presented. This new model learns sequences of events and recalls a sequence when presented with parts of it. A sequence can have repeated events, and different sequences can share events. The ART model is modified by creating interconnected sublayers within ART's F2 layer. Nodes within F2 learn temporal patterns by forming recency gradients within LTM. Versions of the ART model such as ART 1, ART 2, and fuzzy ART can be used.
Abstract:
BACKGROUND AND PURPOSE: Docetaxel is an active agent in the treatment of metastatic breast cancer. We evaluated the feasibility of docetaxel-based sequential and combination regimens as adjuvant therapies for patients with node-positive breast cancer. PATIENTS AND METHODS: Three consecutive groups of patients with node-positive breast cancer or locally advanced disease, aged < or = 70 years, received one of the following regimens: a) sequential A-->T-->CMF: doxorubicin 75 mg/m2 q 3 weeks x 3, followed by docetaxel 100 mg/m2 q 3 weeks x 3, followed by i.v. CMF days 1 + 8 q 4 weeks x 3; b) sequential accelerated A-->T-->CMF: A and T administered at the same doses q 2 weeks; c) combination therapy: doxorubicin 50 mg/m2 + docetaxel 75 mg/m2 q 3 weeks x 4, followed by CMF x 4. When indicated, radiotherapy was administered during or after CMF, and tamoxifen was started after the end of CMF. RESULTS: Seventy-nine patients were treated. Median age was 48 years. A 30% rate of early treatment discontinuation was observed in patients receiving the sequential accelerated therapy (23% during A-->T), due principally to severe skin toxicity. Median relative dose-intensity was 100% in the three treatment arms. The incidence of G3-G4 major toxicities among treated patients was as follows: skin toxicity a: 5%; b: 27%; c: 0%; stomatitis a: 20%; b: 20%; c: 3%. The incidence of neutropenic fever was a: 30%; b: 13%; c: 48%. After a median follow-up of 18 months, no late toxicity has been reported. CONCLUSIONS: The accelerated sequential A-->T-->CMF treatment is not feasible owing to an excess of skin toxicity. The sequential non-accelerated and combination regimens are feasible and under evaluation in a phase III trial of adjuvant therapy.
Abstract:
BACKGROUND: Docetaxel has proven efficacy in metastatic breast cancer. In this pilot study, we explored the efficacy/feasibility of docetaxel-based sequential and combination regimens as adjuvant therapy for node-positive breast cancer. PATIENTS AND METHODS: From March 1996 to March 1998, four consecutive groups of patients with stage II and III breast cancer, aged < or = 70 years, received one of the following regimens: a) sequential doxorubicin (A) --> docetaxel (T) --> CMF (cyclophosphamide + methotrexate + 5-fluorouracil): A 75 mg/m2 q 3 wks x 3, followed by T 100 mg/m2 q 3 wks x 3, followed by i.v. CMF days 1 + 8 q 4 wks x 3; b) sequential accelerated A --> T --> CMF: A and T administered at the same doses q 2 wks with lenograstim support; c) combination therapy: A 50 mg/m2 + T 75 mg/m2 q 3 wks x 4, followed by CMF x 4; d) sequential T --> A --> CMF: T and A administered as in group a), in the reverse sequence. When indicated, radiotherapy was administered during or after CMF, and tamoxifen after CMF. RESULTS: Ninety-three patients were treated. The median age was 48 years (29-66), and the median number of positive axillary nodes was 6 (1-25). Tumors were operable in 94% and locally advanced in 6% of cases. Pathological tumor size was >2 cm in 72% of cases. There were 21 relapses (18 systemic, 3 locoregional), and 11 patients (12%) have died from disease progression. At a median follow-up of 39 months (6-57), overall survival (OS) was 87% (95% CI, 79%-94%) and disease-free survival (DFS) was 76% (95% CI, 67%-85%). CONCLUSION: The efficacy of these docetaxel-based regimens, in terms of OS and DFS, appears to be at least as good as that of standard anthracycline-based adjuvant chemotherapy (CT) in similar high-risk patient populations.
Abstract:
This paper describes a methodology for detecting anomalies from sequentially observed and potentially noisy data. The proposed approach consists of two main elements: 1) filtering, or assigning a belief or likelihood to each successive measurement based upon our ability to predict it from previous noisy observations and 2) hedging, or flagging potential anomalies by comparing the current belief against a time-varying and data-adaptive threshold. The threshold is adjusted based on the available feedback from an end user. Our algorithms, which combine universal prediction with recent work on online convex programming, do not require computing posterior distributions given all current observations and involve simple primal-dual parameter updates. At the heart of the proposed approach lie exponential-family models which can be used in a wide variety of contexts and applications, and which yield methods that achieve sublinear per-round regret against both static and slowly varying product distributions with marginals drawn from the same exponential family. Moreover, the regret against static distributions coincides with the minimax value of the corresponding online strongly convex game. We also prove bounds on the number of mistakes made during the hedging step relative to the best offline choice of the threshold with access to all estimated beliefs and feedback signals. We validate the theory on synthetic data drawn from a time-varying distribution over binary vectors of high dimensionality, as well as on the Enron email dataset. © 1963-2012 IEEE.
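The filtering-plus-hedging loop can be sketched in a few lines. The update rule and parameter names below (`tau`, `eta`) are simplified assumptions for illustration, not the paper's actual primal-dual updates:

```python
def hedge_step(belief, tau, feedback, eta=0.1):
    """One round of the hedging step: flag an anomaly when the belief
    (likelihood assigned to the observation by the filter) falls below
    the threshold tau, then adjust tau using end-user feedback.
    A minimal illustrative rule, not the paper's exact algorithm."""
    flagged = belief < tau
    if flagged and not feedback:       # false alarm: lower the threshold
        tau -= eta
    elif not flagged and feedback:     # missed anomaly: raise the threshold
        tau += eta
    return flagged, tau

# Simulated stream: low belief means the observation was hard to predict.
beliefs = [0.9, 0.8, 0.1, 0.85, 0.05]
truth   = [False, False, True, False, True]   # user feedback per round
tau, flags = 0.5, []
for b, fb in zip(beliefs, truth):
    flagged, tau = hedge_step(b, tau, fb)
    flags.append(flagged)
```

The threshold moves only when the flag disagrees with the feedback, which is the data-adaptive behavior the abstract describes.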
Abstract:
A popular way to account for unobserved heterogeneity is to assume that the data are drawn from a finite mixture distribution. A barrier to using finite mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. We show, however, that an extension of the EM algorithm reintroduces additive separability, thus allowing one to estimate parameters sequentially during each maximization step. In establishing this result, we develop a broad class of estimators for mixture models. Returning to the likelihood problem, we show that, relative to full information maximum likelihood, our sequential estimator can generate large computational savings with little loss of efficiency.
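To see how the E-step restores additive separability, consider a generic two-component, unit-variance Gaussian mixture: once responsibilities are fixed, each component's parameters are maximized in its own independent step. This is a textbook EM sketch, not the paper's sequential estimator:

```python
import math
import random

def em_two_gaussians(xs, iters=50):
    """EM for a two-component Gaussian mixture with unit variances.
    Given the E-step responsibilities, the expected log-likelihood is
    additively separable across components, so each mean and weight is
    updated in a separate, independent M-step."""
    mu = [min(xs), max(xs)]          # crude but adequate initialization
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [w[k] * math.exp(-0.5 * (x - mu[k]) ** 2) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: each component is solved on its own (sequentially)
        for k in range(2):
            rk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / rk
            w[k] = rk / len(xs)
    return mu, w

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(200)] + \
     [random.gauss(5, 1) for _ in range(200)]
mu, w = em_two_gaussians(xs)
```

The inner `for k in range(2)` loop is the point: nothing in component 0's update touches component 1's parameters, which is exactly the stage-wise estimation the abstract says the EM extension recovers.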
Abstract:
We conduct the first empirical investigation of common-pool resource users' dynamic and strategic behavior at the micro level using real-world data. Fishermen's strategies in a fully dynamic game account for latent resource dynamics and other players' actions, revealing the profit structure of the fishery. We compare the fishermen's actual and socially optimal exploitation paths under a time-specific vessel allocation policy and find a sizable dynamic externality. Individual fishermen respond to other users by exerting effort above the optimal level early in the season. Congestion is costly instantaneously but is beneficial in the long run because it partially offsets dynamic inefficiencies.
Abstract:
BACKGROUND: Some of the 600,000 patients with solid organ allotransplants need reconstruction with a composite tissue allotransplant, such as the hand, abdominal wall, or face. The aim of this study was to develop a rat model for assessing the effects of a secondary composite tissue allotransplant on a primary heart allotransplant. METHODS: Hearts of Wistar Kyoto rats were harvested and transplanted heterotopically to the neck of recipient Fischer 344 rats. The anastomoses were performed between the donor brachiocephalic artery and the recipient left common carotid artery, and between the donor pulmonary artery and the recipient external jugular vein. Recipients received cyclosporine A for 10 days only. Heart rate was assessed noninvasively. The sequential composite tissue allotransplant consisted of a 3 x 3-cm abdominal musculocutaneous flap harvested from Lewis rats and transplanted to the abdomen of the heart allotransplant recipients. The abdominal flap vessels were connected to the femoral vessels. No further immunosuppression was administered following the composite tissue allotransplant. Ten days after composite tissue allotransplantation, rejection of the heart and abdominal flap was assessed histologically. RESULTS: The survival rate after the two-stage transplant surgery was 80 percent. The transplanted heart rate decreased from 150 +/- 22 beats per minute immediately after transplant to 83 +/- 12 beats per minute on day 20 (10 days after stopping immunosuppression). CONCLUSIONS: This sequential allotransplant model is technically demanding. It will facilitate investigation of the effects of a secondary composite tissue allotransplant following primary solid organ transplantation and could be useful in developing future immunotherapeutic strategies.
Abstract:
The requirement for very accurate dependence analysis to underpin software tools that aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper summarises the limitations of conventional dependence analysis techniques and then describes a series of extensions that enable the production of a much more accurate dependence graph. The extensions include analysis of symbolic variables; the development of a symbolic inequality disproof algorithm and its exploitation in a symbolic Banerjee inequality test; the use of inference engine proofs; the exploitation of exact dependence and dependence pre-domination attributes; interprocedural array analysis; conditional variable definition tracing; and integer array tracing and division calculations. Analysis case studies on typical numerical code show that the total dependencies estimated by conventional analysis are reduced by up to 50%. The techniques described in this paper have been embedded within a suite of tools, CAPTools, which combines analysis with user knowledge to produce efficient parallel implementations of numerical mesh-based codes.
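For readers unfamiliar with the baseline being extended, here is a minimal sketch of the classical Banerjee inequality test for a single loop index with constant coefficients (a deliberately simplified form, not CAPTools' symbolic version):

```python
def banerjee_test(a1, a0, b1, b0, lo, hi):
    """Banerjee inequality test, single-index constant-coefficient form.
    For accesses A[a1*i + a0] and A[b1*j + b0] with i, j in [lo, hi],
    a dependence requires a1*i - b1*j = b0 - a0 for some i, j in bounds.
    The test bounds the left-hand side over the iteration space and
    checks whether the constant b0 - a0 falls inside that interval.
    Returns False when dependence is disproved, True when it cannot
    be ruled out (the test is conservative)."""
    def pos(x): return max(x, 0)
    def neg(x): return min(x, 0)
    # min/max of a1*i - b1*j over i, j in [lo, hi]
    lhs_min = pos(a1) * lo + neg(a1) * hi - (pos(b1) * hi + neg(b1) * lo)
    lhs_max = pos(a1) * hi + neg(a1) * lo - (pos(b1) * lo + neg(b1) * hi)
    return lhs_min <= b0 - a0 <= lhs_max

# A[i] written, A[i+100] read, i in [0, 10]: dependence disproved.
independent = banerjee_test(1, 0, 1, 100, 0, 10)
# A[2*i] vs A[2*j+1]: Banerjee cannot disprove, though parity rules it out,
# illustrating why the paper needs stronger (e.g. symbolic) machinery.
maybe_dependent = banerjee_test(2, 0, 2, 1, 0, 10)
```

The second case is the kind of conservative answer the abstract complains about: the interval test alone cannot exploit integer (parity) information, let alone symbolic bounds.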
Abstract:
In this paper, we study a problem of scheduling and batching on two machines in flow-shop and open-shop environments. Each machine processes operations in batches, and the processing time of a batch is the sum of the processing times of the operations in that batch. A setup time, which depends only on the machine, is required before a batch is processed on a machine, and all jobs in a batch remain at the machine until the entire batch is processed. The aim is to make batching and sequencing decisions, which specify a partition of the jobs into batches on each machine and a processing order of the batches on each machine, respectively, so that the makespan is minimized. The flow-shop problem is shown to be strongly NP-hard. We demonstrate that there is an optimal solution with the same batches on the two machines; we refer to these as consistent batches. A heuristic is developed that selects the best schedule among several with one, two, or three consistent batches, and is shown to have a worst-case performance ratio of 4/3. For the open shop, we show that the problem is NP-hard in the ordinary sense. By proving the existence of an optimal solution with one, two, or three consistent batches, a close relationship is established with the problem of scheduling two or three identical parallel machines to minimize the makespan. This allows a pseudo-polynomial algorithm to be derived, and various heuristic methods to be suggested.
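The makespan evaluation underlying such heuristics is straightforward to sketch. Below is a minimal two-machine flow-shop makespan computation for a given set of consistent batches (the data and names are illustrative assumptions, and this is only the evaluation step, not the paper's heuristic):

```python
def flowshop_makespan(batches, setup, times):
    """Makespan on a two-machine flow shop with batching. Each batch
    incurs a machine-dependent setup time plus the sum of its jobs'
    processing times, and moves to machine 2 only once the whole batch
    has finished on machine 1. `batches` is a partition of job indices
    used consistently on both machines."""
    c1 = c2 = 0
    for batch in batches:
        p1 = setup[0] + sum(times[j][0] for j in batch)
        p2 = setup[1] + sum(times[j][1] for j in batch)
        c1 += p1                  # batch completion on machine 1
        c2 = max(c2, c1) + p2     # then the batch runs on machine 2
    return c2

times = [(3, 2), (1, 4), (2, 2)]   # (machine-1, machine-2) job times
setup = (1, 1)                     # per-batch setup on each machine

one_batch = flowshop_makespan([[0, 1, 2]], setup, times)        # -> 16
per_job   = flowshop_makespan([[0], [1], [2]], setup, times)    # -> 15
```

Even this tiny instance shows the trade-off the heuristic navigates: more batches pay more setup time but allow the second machine to start earlier.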